[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
7530 1727096008.91250: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
7530 1727096008.91551: Added group all to inventory
7530 1727096008.91553: Added group ungrouped to inventory
7530 1727096008.91556: Group all now contains ungrouped
7530 1727096008.91559: Examining possible inventory source: /tmp/network-EuO/inventory.yml
7530 1727096009.01673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
7530 1727096009.01715: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
7530 1727096009.01734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
7530 1727096009.01776: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
7530 1727096009.01827: Loaded config def from plugin (inventory/script)
7530 1727096009.01828: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
7530 1727096009.01855: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
7530 1727096009.01913: Loaded config def from plugin (inventory/yaml)
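The deprecation warning above is about the renamed collections-path setting. A minimal ansible.cfg sketch that uses the current singular option name and (optionally) silences deprecation warnings, assuming the collections live under /tmp/collections-And as this run's banner reports:

```ini
[defaults]
; singular form; the plural ANSIBLE_COLLECTIONS_PATHS spelling is removed in ansible-core 2.19
collections_path = /tmp/collections-And
; optional: suppress deprecation warnings (note this hides all deprecations, not just this one)
deprecation_warnings = False
```

The same path can be set per-invocation with the ANSIBLE_COLLECTIONS_PATH environment variable instead of a config file.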
7530 1727096009.01915: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
7530 1727096009.01985: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
7530 1727096009.02263: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
7530 1727096009.02265: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
7530 1727096009.02269: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
7530 1727096009.02274: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
7530 1727096009.02277: Loading data from /tmp/network-EuO/inventory.yml
7530 1727096009.02323: /tmp/network-EuO/inventory.yml was not parsable by auto
7530 1727096009.02364: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
7530 1727096009.02394: Loading data from /tmp/network-EuO/inventory.yml
7530 1727096009.02449: group all already in inventory
7530 1727096009.02454: set inventory_file for managed_node1
7530 1727096009.02457: set inventory_dir for managed_node1
7530 1727096009.02458: Added host managed_node1 to inventory
7530 1727096009.02459: Added host managed_node1 to group all
7530 1727096009.02460: set ansible_host for managed_node1
7530 1727096009.02460: set ansible_ssh_extra_args for managed_node1
7530 1727096009.02462: set inventory_file for managed_node2
7530 1727096009.02464: set inventory_dir for managed_node2
7530 1727096009.02464: Added host managed_node2 to inventory
7530 1727096009.02465: Added host managed_node2 to group all
7530 1727096009.02466: set ansible_host for managed_node2
7530 1727096009.02466: set ansible_ssh_extra_args for managed_node2
7530 1727096009.02470: set inventory_file for managed_node3
7530 1727096009.02471: set inventory_dir for managed_node3
7530 1727096009.02472: Added host managed_node3 to inventory
7530 1727096009.02473: Added host managed_node3 to group all
7530 1727096009.02473: set ansible_host for managed_node3
7530 1727096009.02474: set ansible_ssh_extra_args for managed_node3
7530 1727096009.02475: Reconcile groups and hosts in inventory.
7530 1727096009.02478: Group ungrouped now contains managed_node1
7530 1727096009.02479: Group ungrouped now contains managed_node2
7530 1727096009.02480: Group ungrouped now contains managed_node3
7530 1727096009.02534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
7530 1727096009.02612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
7530 1727096009.02646: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
7530 1727096009.02664: Loaded config def from plugin (vars/host_group_vars)
7530 1727096009.02665: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
7530 1727096009.02672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
7530 1727096009.02678: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
7530 1727096009.02705: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
7530 1727096009.02938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096009.03008: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
7530 1727096009.03033: Loaded config def from plugin (connection/local)
7530 1727096009.03035: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
7530 1727096009.03413: Loaded config def from plugin (connection/paramiko_ssh)
7530 1727096009.03418: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
7530 1727096009.03983: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7530 1727096009.04006: Loaded config def from plugin (connection/psrp)
7530 1727096009.04008: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
7530 1727096009.04419: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7530 1727096009.04443: Loaded config def from plugin (connection/ssh)
7530 1727096009.04444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
7530 1727096009.05724: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7530 1727096009.05747: Loaded config def from plugin (connection/winrm)
7530 1727096009.05749: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
7530 1727096009.05773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
7530 1727096009.05821: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
7530 1727096009.05860: Loaded config def from plugin (shell/cmd)
7530 1727096009.05861: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
7530 1727096009.05880: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
7530 1727096009.05922: Loaded config def from plugin (shell/powershell)
7530 1727096009.05924: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
7530 1727096009.05959: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
7530 1727096009.06063: Loaded config def from plugin (shell/sh)
7530 1727096009.06065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
7530 1727096009.06088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
7530 1727096009.06161: Loaded config def from plugin (become/runas)
7530 1727096009.06163: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
7530 1727096009.06275: Loaded config def from plugin (become/su)
7530 1727096009.06277: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
7530 1727096009.06372: Loaded config def from plugin (become/sudo)
7530 1727096009.06374: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
7530 1727096009.06395: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7530 1727096009.06608: in VariableManager get_vars()
7530 1727096009.06624: done with get_vars()
7530 1727096009.06710: trying /usr/local/lib/python3.12/site-packages/ansible/modules
7530 1727096009.08624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
7530 1727096009.08696: in VariableManager get_vars()
7530 1727096009.08699: done with get_vars()
7530 1727096009.08701: variable 'playbook_dir' from source: magic vars
7530 1727096009.08702: variable 'ansible_playbook_python' from source: magic vars
7530 1727096009.08702: variable 'ansible_config_file' from source: magic vars
7530 1727096009.08703: variable 'groups' from source: magic vars
7530 1727096009.08703: variable 'omit' from source: magic vars
7530 1727096009.08703: variable 'ansible_version' from source: magic vars
7530 1727096009.08704: variable 'ansible_check_mode' from source: magic vars
7530 1727096009.08704: variable 'ansible_diff_mode' from source: magic vars
7530 1727096009.08705: variable 'ansible_forks' from source: magic vars
7530 1727096009.08705: variable 'ansible_inventory_sources' from source: magic vars
7530 1727096009.08706: variable 'ansible_skip_tags' from source: magic vars
7530 1727096009.08706: variable 'ansible_limit' from source: magic vars
7530 1727096009.08706: variable 'ansible_run_tags' from source: magic vars
7530 1727096009.08707: variable 'ansible_verbosity' from source: magic vars
7530 1727096009.08731: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml
7530 1727096009.09124: in VariableManager get_vars()
7530 1727096009.09134: done with get_vars()
7530 1727096009.09156: in VariableManager get_vars()
7530 1727096009.09163: done with get_vars()
7530 1727096009.09188: in VariableManager get_vars()
7530 1727096009.09196: done with get_vars()
7530 1727096009.09284: in VariableManager get_vars()
7530 1727096009.09295: done with get_vars()
7530 1727096009.09299: variable 'omit' from source: magic vars
7530 1727096009.09310: variable 'omit' from source: magic vars
7530 1727096009.09332: in VariableManager get_vars()
7530 1727096009.09339: done with get_vars()
7530 1727096009.09368: in VariableManager get_vars()
7530 1727096009.09377: done with get_vars()
7530 1727096009.09403: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7530 1727096009.09531: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7530 1727096009.09606: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7530 1727096009.09975: in VariableManager get_vars()
7530 1727096009.09987: done with get_vars()
7530 1727096009.10260: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
7530 1727096009.10347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7530 1727096009.11362: in VariableManager get_vars()
7530 1727096009.11376: done with get_vars()
7530 1727096009.11400: in VariableManager get_vars()
7530 1727096009.11421: done with get_vars()
7530 1727096009.11693: in VariableManager get_vars()
7530 1727096009.11704: done with get_vars()
7530 1727096009.11707: variable 'omit' from source: magic vars
7530 1727096009.11714: variable 'omit' from source: magic vars
7530 1727096009.11734: in VariableManager get_vars()
7530 1727096009.11743: done with get_vars()
7530 1727096009.11756: in VariableManager get_vars()
7530 1727096009.11765: done with get_vars()
7530 1727096009.11785: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7530 1727096009.11850: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7530 1727096009.11893: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7530 1727096009.13362: in VariableManager get_vars()
7530 1727096009.13379: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7530 1727096009.14821: in VariableManager get_vars()
7530 1727096009.14841: done with get_vars()
7530 1727096009.14946: in VariableManager get_vars()
7530 1727096009.14966: done with get_vars()
7530 1727096009.15021: in VariableManager get_vars()
7530 1727096009.15040: done with get_vars()
7530 1727096009.15044: variable 'omit' from source: magic vars
7530 1727096009.15054: variable 'omit' from source: magic vars
7530 1727096009.15084: in VariableManager get_vars()
7530 1727096009.15098: done with get_vars()
7530 1727096009.15116: in VariableManager get_vars()
7530 1727096009.15136: done with get_vars()
7530 1727096009.15162: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7530 1727096009.15273: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7530 1727096009.15349: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7530 1727096009.15651: in VariableManager get_vars()
7530 1727096009.15669: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7530 1727096009.16884: in VariableManager get_vars()
7530 1727096009.16898: done with get_vars()
7530 1727096009.16924: in VariableManager get_vars()
7530 1727096009.16936: done with get_vars()
7530 1727096009.17213: in VariableManager get_vars()
7530 1727096009.17241: done with get_vars()
7530 1727096009.17244: variable 'omit' from source: magic vars
7530 1727096009.17255: variable 'omit' from source: magic vars
7530 1727096009.17289: in VariableManager get_vars()
7530 1727096009.17304: done with get_vars()
7530 1727096009.17319: in VariableManager get_vars()
7530 1727096009.17331: done with get_vars()
7530 1727096009.17357: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7530 1727096009.17461: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7530 1727096009.17526: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7530 1727096009.17824: in VariableManager get_vars()
7530 1727096009.17848: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7530 1727096009.19405: in VariableManager get_vars()
7530 1727096009.19425: done with get_vars()
7530 1727096009.19448: in VariableManager get_vars()
7530 1727096009.19461: done with get_vars()
7530 1727096009.19498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
7530 1727096009.19510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
7530 1727096009.19697: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
7530 1727096009.19788: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
7530 1727096009.19790: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
7530 1727096009.19811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
7530 1727096009.19830: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
7530 1727096009.19929: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
7530 1727096009.19966: Loaded config def from plugin (callback/default)
7530 1727096009.19970: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7530 1727096009.20906: Loaded config def from plugin (callback/junit)
7530 1727096009.20908: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7530 1727096009.20953: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
7530 1727096009.21018: Loaded config def from plugin (callback/minimal)
7530 1727096009.21021: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7530 1727096009.21060: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7530 1727096009.21121: Loaded config def from plugin (callback/tree)
7530 1727096009.21124: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
7530 1727096009.21237: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
7530 1727096009.21239: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
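For reference, a hypothetical /tmp/network-EuO/inventory.yml consistent with what the inventory pass above records (three hosts in the implicit all/ungrouped groups, each with ansible_host and ansible_ssh_extra_args set) could look like the following sketch; the addresses and SSH options are illustrative assumptions, not values recovered from this run:

```yaml
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11    # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder options
    managed_node2:
      ansible_host: 203.0.113.12    # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder options
    managed_node3:
      ansible_host: 203.0.113.13    # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder options
```

This layout explains the plugin sequence in the log: the file is YAML, so the auto plugin passes on it and the yaml inventory plugin parses it, adding each host to the implicit all group and, lacking any explicit group, to ungrouped.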
PLAYBOOK: tests_auto_gateway_nm.yml ********************************************
2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7530 1727096009.21269: in VariableManager get_vars()
7530 1727096009.21283: done with get_vars()
7530 1727096009.21288: in VariableManager get_vars()
7530 1727096009.21297: done with get_vars()
7530 1727096009.21301: variable 'omit' from source: magic vars
7530 1727096009.21337: in VariableManager get_vars()
7530 1727096009.21353: done with get_vars()
7530 1727096009.21375: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider] *****
7530 1727096009.21900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
7530 1727096009.21965: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
7530 1727096009.22007: getting the remaining hosts for this loop
7530 1727096009.22009: done getting the remaining hosts for this loop
7530 1727096009.22012: getting the next task for host managed_node3
7530 1727096009.22015: done getting next task for host managed_node3
7530 1727096009.22017: ^ task is: TASK: Gathering Facts
7530 1727096009.22018: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096009.22020: getting variables
7530 1727096009.22021: in VariableManager get_vars()
7530 1727096009.22027: Calling all_inventory to load vars for managed_node3
7530 1727096009.22029: Calling groups_inventory to load vars for managed_node3
7530 1727096009.22030: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096009.22039: Calling all_plugins_play to load vars for managed_node3
7530 1727096009.22045: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096009.22047: Calling groups_plugins_play to load vars for managed_node3
7530 1727096009.22071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096009.22107: done with get_vars()
7530 1727096009.22112: done getting variables
7530 1727096009.22187: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Monday 23 September 2024  08:53:29 -0400 (0:00:00.010)       0:00:00.010 ******
7530 1727096009.22215: entering _queue_task() for managed_node3/gather_facts
7530 1727096009.22216: Creating lock for gather_facts
7530 1727096009.22471: worker is 1 (out of 1 available)
7530 1727096009.22480: exiting _queue_task() for managed_node3/gather_facts
7530 1727096009.22492: done queuing things up, now waiting for results queue to drain
7530 1727096009.22494: waiting for pending results...
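The play and task headers above come from the wrapper playbook tests_auto_gateway_nm.yml, which per the log contains two plays and imports playbooks/tests_auto_gateway.yml with NetworkManager (nm) as the provider. A plausible sketch of such a wrapper follows; only the play name, the imported playbook path, and the nm provider are taken from the log, and the exact variable name and task layout are assumptions:

```yaml
---
# Hypothetical wrapper sketch; not recovered verbatim from this run.
- name: Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set the provider the imported tests should exercise
      ansible.builtin.set_fact:
        network_provider: nm   # assumed variable name

- import_playbook: playbooks/tests_auto_gateway.yml
```

This shape would also explain why the Gathering Facts task resolves to line 6 of the wrapper file: the first play gathers facts implicitly before running its tasks.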
7530 1727096009.22637: running TaskExecutor() for managed_node3/TASK: Gathering Facts
7530 1727096009.22694: in run() - task 0afff68d-5257-086b-f4f0-000000000155
7530 1727096009.22705: variable 'ansible_search_path' from source: unknown
7530 1727096009.22736: calling self._execute()
7530 1727096009.22783: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096009.22788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096009.22796: variable 'omit' from source: magic vars
7530 1727096009.22865: variable 'omit' from source: magic vars
7530 1727096009.22889: variable 'omit' from source: magic vars
7530 1727096009.22913: variable 'omit' from source: magic vars
7530 1727096009.22953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096009.22982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096009.22998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096009.23011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096009.23024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096009.23057: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096009.23060: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096009.23062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096009.23133: Set connection var ansible_pipelining to False
7530 1727096009.23137: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096009.23143: Set connection var ansible_timeout to 10
7530 1727096009.23152: Set connection var ansible_shell_executable to /bin/sh
7530 1727096009.23155: Set connection var ansible_shell_type to sh
7530 1727096009.23157: Set connection var ansible_connection to ssh
7530 1727096009.23176: variable 'ansible_shell_executable' from source: unknown
7530 1727096009.23179: variable 'ansible_connection' from source: unknown
7530 1727096009.23182: variable 'ansible_module_compression' from source: unknown
7530 1727096009.23185: variable 'ansible_shell_type' from source: unknown
7530 1727096009.23188: variable 'ansible_shell_executable' from source: unknown
7530 1727096009.23190: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096009.23193: variable 'ansible_pipelining' from source: unknown
7530 1727096009.23195: variable 'ansible_timeout' from source: unknown
7530 1727096009.23197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096009.23331: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096009.23339: variable 'omit' from source: magic vars
7530 1727096009.23344: starting attempt loop
7530 1727096009.23347: running the handler
7530 1727096009.23359: variable 'ansible_facts' from source: unknown
7530 1727096009.23377: _low_level_execute_command(): starting
7530 1727096009.23386: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096009.23970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<<
7530 1727096009.23974: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096009.24027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096009.24069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096009.24115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096009.25792: stdout chunk (state=3): >>>/root <<<
7530 1727096009.25914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096009.25921: stdout chunk (state=3): >>><<<
7530 1727096009.25930: stderr chunk (state=3): >>><<<
7530 1727096009.25948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096009.25961: _low_level_execute_command(): starting
7530 1727096009.25971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938 `" && echo ansible-tmp-1727096009.2594917-7545-41768837591938="` echo /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938 `" ) && sleep 0'
7530 1727096009.26436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096009.26440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096009.26443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096009.26445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096009.26493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096009.26500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096009.26502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096009.26539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096009.28497: stdout chunk (state=3): >>>ansible-tmp-1727096009.2594917-7545-41768837591938=/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938 <<<
7530 1727096009.28595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096009.28629: stderr chunk (state=3): >>><<<
7530 1727096009.28633: stdout chunk (state=3): >>><<<
7530 1727096009.28649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096009.2594917-7545-41768837591938=/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096009.28682: variable 'ansible_module_compression' from source: unknown
7530 1727096009.28732: ANSIBALLZ: Using generic lock for ansible.legacy.setup
7530 1727096009.28735: ANSIBALLZ: Acquiring lock
7530 1727096009.28738: ANSIBALLZ: Lock acquired: 139837168144544
7530 1727096009.28740: ANSIBALLZ: Creating module
7530 1727096009.47805: ANSIBALLZ: Writing module into payload
7530 1727096009.47899: ANSIBALLZ: Writing module
7530 1727096009.47918: ANSIBALLZ: Renaming module
7530 1727096009.47932: ANSIBALLZ: Done creating module
7530 1727096009.47955: variable 'ansible_facts' from source: unknown
7530 1727096009.47962: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096009.47971: _low_level_execute_command(): starting
7530 1727096009.47978: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
7530 1727096009.48413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096009.48420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530
1727096009.48432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096009.48492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096009.48497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096009.48542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096009.50238: stdout chunk (state=3): >>>PLATFORM <<< 7530 1727096009.50322: stdout chunk (state=3): >>>Linux <<< 7530 1727096009.50337: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 7530 1727096009.50577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096009.50581: stdout chunk (state=3): >>><<< 7530 1727096009.50584: stderr chunk (state=3): >>><<< 7530 1727096009.50587: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096009.50592 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 7530 1727096009.50621: _low_level_execute_command(): starting 7530 1727096009.50673: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 7530 1727096009.50845: Sending initial data 7530 1727096009.50848: Sent initial data (1181 bytes) 7530 1727096009.51328: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096009.51345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096009.51384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096009.51397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096009.51487: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096009.51514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096009.51536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096009.51560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096009.51635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096009.55197: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 7530 1727096009.55627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096009.55657: stderr chunk (state=3): >>><<< 7530 1727096009.55662: stdout chunk (state=3): >>><<< 7530 1727096009.55676: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096009.55736: variable 'ansible_facts' from source: unknown 7530 1727096009.55739: variable 'ansible_facts' from source: unknown 7530 1727096009.55751: variable 'ansible_module_compression' from source: unknown 7530 1727096009.55785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7530 1727096009.55809: variable 'ansible_facts' from source: unknown 7530 1727096009.55960: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py 7530 1727096009.56196: Sending initial data 7530 1727096009.56200: Sent initial data (151 bytes) 7530 1727096009.56733: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096009.56750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096009.56781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096009.56877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096009.56891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096009.56912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096009.56930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096009.57019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096009.58676: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096009.58698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096009.58748: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpvns_2xfo /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py <<< 7530 1727096009.58752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py" <<< 7530 1727096009.58766: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpvns_2xfo" to remote "/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py" <<< 7530 1727096009.60195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096009.60199: stdout chunk (state=3): >>><<< 7530 1727096009.60202: stderr chunk (state=3): >>><<< 7530 1727096009.60228: done transferring module to remote 7530 1727096009.60333: _low_level_execute_command(): starting 7530 1727096009.60336: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/ 
/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py && sleep 0' 7530 1727096009.61271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096009.61290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096009.61305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096009.61326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096009.61370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096009.61455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096009.61485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096009.61551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096009.63469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096009.63505: stderr chunk (state=3): >>><<< 7530 1727096009.63509: stdout chunk (state=3): >>><<< 7530 1727096009.63574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096009.63578: _low_level_execute_command(): starting 7530 1727096009.63582: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/AnsiballZ_setup.py && sleep 0' 7530 1727096009.64229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096009.64244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096009.64260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096009.64339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096009.64385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096009.64405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096009.64446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096009.64524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096009.66758: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 7530 1727096009.66789: stdout chunk (state=3): >>>import _imp # builtin <<< 7530 1727096009.66826: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 7530 1727096009.66879: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7530 1727096009.66915: stdout chunk (state=3): >>>import 'posix' # <<< 7530 1727096009.66949: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 7530 1727096009.66999: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 7530 1727096009.67009: stdout chunk (state=3): >>># installed zipimport hook <<< 7530 1727096009.67047: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 7530 
1727096009.67071: stdout chunk (state=3): >>>import '_codecs' # <<< 7530 1727096009.67092: stdout chunk (state=3): >>>import 'codecs' # <<< 7530 1727096009.67126: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 7530 1727096009.67160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 7530 1727096009.67163: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83c184d0> <<< 7530 1727096009.67196: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83be7b30> <<< 7530 1727096009.67199: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 7530 1727096009.67229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83c1aa50> <<< 7530 1727096009.67231: stdout chunk (state=3): >>>import '_signal' # <<< 7530 1727096009.67258: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 7530 1727096009.67281: stdout chunk (state=3): >>>import 'io' # <<< 7530 1727096009.67320: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 7530 1727096009.67421: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7530 1727096009.67435: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7530 1727096009.67466: stdout chunk (state=3): >>>import 'os' # <<< 7530 1727096009.67509: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 7530 1727096009.67537: stdout chunk (state=3): >>>Adding directory: 
'/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 7530 1727096009.67554: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 7530 1727096009.67581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a2d130> <<< 7530 1727096009.67639: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 7530 1727096009.67664: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a2dfa0> <<< 7530 1727096009.67678: stdout chunk (state=3): >>>import 'site' # <<< 7530 1727096009.67711: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7530 1727096009.68106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7530 1727096009.68109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7530 1727096009.68139: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.68154: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 7530 1727096009.68213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 7530 1727096009.68216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7530 1727096009.68246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 7530 1727096009.68278: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a6be00> <<< 7530 1727096009.68281: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 7530 1727096009.68333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 7530 1727096009.68342: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a6bec0> <<< 7530 1727096009.68352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 7530 1727096009.68362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 7530 
1727096009.68386: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7530 1727096009.68441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.68459: stdout chunk (state=3): >>>import 'itertools' # <<< 7530 1727096009.68728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83aa37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83aa3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a83ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a811f0> <<< 7530 1727096009.68731: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a68fb0> <<< 7530 1727096009.68734: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7530 1727096009.68753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 7530 1727096009.68775: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 7530 1727096009.68798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7530 
1727096009.68829: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7530 1727096009.68853: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83ac3770> <<< 7530 1727096009.68872: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83ac2390> <<< 7530 1727096009.68901: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83ac0bc0> <<< 7530 1727096009.68954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 7530 1727096009.68976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a68230> <<< 7530 1727096009.69038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83af8cb0> 
import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af8b60> <<< 7530 1727096009.69073: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83af8ef0> <<< 7530 1727096009.69152: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a66d50> <<< 7530 1727096009.69175: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af9580> <<< 7530 1727096009.69252: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af9250> import 'importlib.machinery' # <<< 7530 1727096009.69359: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83afa480> <<< 7530 1727096009.69362: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 7530 1727096009.69403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 7530 1727096009.69406: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83b11d90> <<< 7530 1727096009.69427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 7530 1727096009.69470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 7530 1727096009.69476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 7530 1727096009.69504: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b12c30> <<< 7530 1727096009.69515: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83b13290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b12180> <<< 7530 1727096009.69578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 7530 1727096009.69601: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83b13d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b13440> <<< 7530 1727096009.69645: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83afa4e0> <<< 7530 1727096009.69665: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 7530 1727096009.69714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 7530 1727096009.69725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 7530 1727096009.69815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83803bc0> <<< 7530 1727096009.69903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 7530 1727096009.69907: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382c410> <<< 7530 1727096009.69941: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382c6e0> <<< 7530 1727096009.69944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 7530 1727096009.69963: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.70078: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382d010> <<< 7530 1727096009.70254: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.70338: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83801d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc 
matches /usr/lib64/python3.12/weakref.py <<< 7530 1727096009.70349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7530 1727096009.70370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83afabd0> <<< 7530 1727096009.70396: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7530 1727096009.70480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.70483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 7530 1727096009.70508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7530 1727096009.70535: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8385b140> <<< 7530 1727096009.70625: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 7530 1727096009.70629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.70772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7530 1727096009.70778: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8387b500> <<< 7530 1727096009.70799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7530 1727096009.70819: stdout chunk (state=3): >>>import 'ntpath' # <<< 7530 1727096009.70845: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838dc260> <<< 7530 1727096009.70867: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 7530 1727096009.70900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 7530 1727096009.70922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7530 1727096009.71009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7530 1727096009.71052: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838de9c0> <<< 7530 1727096009.71128: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838dc380> <<< 7530 1727096009.71230: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838a5280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d836dd370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8387a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382fd40> <<< 7530 1727096009.71398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7530 1727096009.71426: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7d8387a420> <<< 7530 1727096009.71713: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_5cpqtdle/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 7530 1727096009.71833: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.71882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 7530 1727096009.71885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7530 1727096009.71916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7530 1727096009.71997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7530 1727096009.72040: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83742ff0> <<< 7530 1727096009.72043: stdout chunk (state=3): >>>import '_typing' # <<< 7530 
1727096009.72310: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83721ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83721040> <<< 7530 1727096009.72313: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.72334: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 7530 1727096009.72346: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.73757: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.74978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83740ec0> <<< 7530 1727096009.75029: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.75065: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7530 1727096009.75073: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py<<< 7530 1727096009.75096: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83776900> <<< 7530 1727096009.75162: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83776690> <<< 7530 1727096009.75187: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83775fa0> <<< 7530 1727096009.75207: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 7530 1727096009.75249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7530 1727096009.75287: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d837763f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83743c80> <<< 7530 1727096009.75291: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.75348: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d837776b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.75353: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d837778f0> <<< 7530 1727096009.75372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 7530 1727096009.75419: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 7530 1727096009.75431: stdout chunk (state=3): >>>import '_locale' # <<< 7530 1727096009.75490: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83777e30> <<< 7530 1727096009.75505: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 7530 1727096009.75538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7530 1727096009.75580: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83125bb0> <<< 7530 1727096009.75614: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d831277d0> <<< 7530 1727096009.75635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 7530 1727096009.75653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7530 1727096009.75693: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83128170> <<< 7530 1727096009.75719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7530 1727096009.75746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7530 1727096009.75769: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7d83129310> <<< 7530 1727096009.75791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 7530 1727096009.75845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7530 1727096009.75858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7530 1727096009.75948: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8312bda0> <<< 7530 1727096009.75963: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d837230e0> <<< 7530 1727096009.75990: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8312a060> <<< 7530 1727096009.76026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 7530 1727096009.76037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 7530 1727096009.76083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 7530 1727096009.76091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7530 1727096009.76199: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 7530 1727096009.76225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 7530 1727096009.76234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 7530 1727096009.76248: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83133d40> <<< 7530 1727096009.76258: stdout chunk (state=3): >>>import '_tokenize' # <<< 7530 1727096009.76328: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83132810> <<< 7530 1727096009.76332: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83132570> <<< 7530 1727096009.76355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 7530 1727096009.76366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7530 1727096009.76434: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83132ae0> <<< 7530 1727096009.76466: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8312a570> <<< 7530 1727096009.76499: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83177f80> <<< 7530 1727096009.76531: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.76536: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831780e0> <<< 7530 1727096009.76553: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 7530 1727096009.76577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7530 1727096009.76603: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7530 1727096009.76647: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.76652: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83179bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83179970> <<< 7530 1727096009.76671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7530 1727096009.76706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7530 1727096009.76759: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8317c140> <<< 7530 1727096009.76764: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8317a2a0> <<< 7530 1727096009.76786: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7530 1727096009.76834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.76854: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 7530 1727096009.76873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 7530 1727096009.76877: stdout chunk (state=3): >>>import '_string' # <<< 7530 1727096009.76923: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8317f860> <<< 7530 1727096009.77043: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8317c230> <<< 7530 1727096009.77109: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83180b90> <<< 7530 1727096009.77141: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.77147: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' 
import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83180710> <<< 7530 1727096009.77185: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.77193: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83180230> <<< 7530 1727096009.77202: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831782f0> <<< 7530 1727096009.77226: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 7530 1727096009.77254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 7530 1727096009.77277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7530 1727096009.77308: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.77333: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8300c110> <<< 7530 1727096009.77492: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.77496: stdout chunk (state=3): >>># extension module 'array' executed 
from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8300d040> <<< 7530 1727096009.77505: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831828a0> <<< 7530 1727096009.77543: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83183c50> <<< 7530 1727096009.77551: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83182540> <<< 7530 1727096009.77575: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77586: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 7530 1727096009.77604: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77694: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77786: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77791: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77811: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 7530 1727096009.77823: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77835: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 7530 1727096009.77856: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.77994: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.78100: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.78679: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.79256: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 7530 1727096009.79265: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 7530 1727096009.79290: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 7530 1727096009.79310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.79370: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83011220> <<< 7530 1727096009.79455: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 7530 1727096009.79481: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83011fd0> <<< 7530 1727096009.79486: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8300d1f0> <<< 7530 1727096009.79541: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7530 1727096009.79552: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.79579: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.79587: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 7530 1727096009.79602: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 7530 1727096009.79753: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.79908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 7530 1727096009.79922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 7530 1727096009.79929: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83012210> <<< 7530 1727096009.79942: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.80416: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.80871: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.80947: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81017: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 7530 1727096009.81036: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81073: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81110: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 7530 1727096009.81113: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81187: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81268: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 7530 1727096009.81296: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096009.81301: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 7530 1727096009.81318: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81360: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81397: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 7530 
1727096009.81413: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81641: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.81881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7530 1727096009.81945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7530 1727096009.81959: stdout chunk (state=3): >>>import '_ast' # <<< 7530 1727096009.82036: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830132c0> <<< 7530 1727096009.82045: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82115: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82192: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 7530 1727096009.82201: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 7530 1727096009.82209: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 7530 1727096009.82236: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82276: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82320: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 7530 1727096009.82322: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82374: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82423: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82478: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82544: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7530 1727096009.82593: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.82684: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8301de80> <<< 7530 1727096009.82722: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8301bf50> <<< 7530 1727096009.82756: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 7530 1727096009.82763: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 7530 1727096009.82772: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82837: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82897: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82926: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.82980: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.83007: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 7530 1727096009.83036: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 7530 1727096009.83042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches 
/usr/lib64/python3.12/argparse.py <<< 7530 1727096009.83116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 7530 1727096009.83132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7530 1727096009.83146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7530 1727096009.83203: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83106630> <<< 7530 1727096009.83246: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831fe300> <<< 7530 1727096009.83329: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83012f60> <<< 7530 1727096009.83346: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83180d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 7530 1727096009.83349: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83380: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83412: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 7530 1727096009.83472: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 7530 1727096009.83477: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83497: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 7530 1727096009.83518: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83577: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83640: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83659: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 7530 1727096009.83685: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83724: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83769: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83804: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83844: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 7530 1727096009.83855: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.83935: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84005: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84032: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84063: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 7530 1727096009.84079: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84268: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84433: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84476: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.84538: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096009.84557: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 7530 1727096009.84580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 7530 1727096009.84594: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/process.py <<< 7530 1727096009.84625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 7530 1727096009.84649: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b1a90> <<< 7530 1727096009.84671: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 7530 1727096009.84682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 7530 1727096009.84705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 7530 1727096009.84751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 7530 1727096009.84780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 7530 1727096009.84784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 7530 1727096009.84809: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c2bc80> <<< 7530 1727096009.84853: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.84876: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c2bfe0> <<< 7530 1727096009.84932: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f7d8309a660> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b25d0> <<< 7530 1727096009.84974: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b0200> <<< 7530 1727096009.85008: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b3b90> <<< 7530 1727096009.85011: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 7530 1727096009.85132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 7530 1727096009.85135: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 7530 1727096009.85138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 7530 1727096009.85198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c46f30> <<< 7530 1727096009.85201: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c467e0> <<< 7530 1727096009.85226: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c469c0> <<< 7530 1727096009.85248: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c45c10> <<< 7530 1727096009.85273: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 7530 1727096009.85444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c470e0> <<< 7530 1727096009.85448: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 7530 1727096009.85495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 7530 1727096009.85499: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c9dbb0> <<< 7530 1727096009.85525: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c47bc0> <<< 7530 1727096009.85563: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b3920> <<< 7530 1727096009.85602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 7530 
1727096009.85646: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 7530 1727096009.85698: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.85702: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.85786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 7530 1727096009.85790: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.85828: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.85891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 7530 1727096009.85923: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096009.85981: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 7530 1727096009.85984: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.86001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 7530 1727096009.86048: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.86115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 7530 1727096009.86159: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.86215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 7530 1727096009.86270: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.86337: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.86388: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.86455: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 7530 1727096009.86466: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 7530 1727096009.86990: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 7530 1727096009.87440: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87485: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87539: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87576: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87612: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 7530 1727096009.87619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 7530 1727096009.87628: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87655: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 7530 1727096009.87698: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87750: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 7530 1727096009.87823: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87855: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 7530 1727096009.87894: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87925: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.87957: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 7530 1727096009.87961: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88050: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 7530 1727096009.88145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 7530 1727096009.88175: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c9f860> <<< 7530 1727096009.88198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 7530 1727096009.88229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 7530 1727096009.88344: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c9e780> import 'ansible.module_utils.facts.system.local' # <<< 7530 1727096009.88365: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88434: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 7530 1727096009.88515: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88604: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 7530 1727096009.88712: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88779: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88852: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 7530 1727096009.88855: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88904: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.88952: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 7530 1727096009.89012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 7530 1727096009.89095: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.89153: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82ccdfa0> <<< 7530 1727096009.89362: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82cbede0> import 'ansible.module_utils.facts.system.python' # <<< 7530 1727096009.89378: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.89432: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.89491: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 7530 1727096009.89501: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.89589: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.89664: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.89787: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.89962: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 7530 1727096009.89966: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096009.89992: stdout chunk (state=3): >>> <<< 7530 1727096009.89997: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 7530 1727096009.90044: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 7530 1727096009.90093: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 7530 1727096009.90160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 7530 1727096009.90201: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096009.90238: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82ce1be0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82cbef90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 7530 1727096009.90259: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 7530 1727096009.90310: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 7530 1727096009.90361: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90513: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 7530 1727096009.90681: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90777: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90880: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90920: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.90976: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 7530 1727096009.91020: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91023: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91036: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91173: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 7530 1727096009.91458: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 7530 1727096009.91595: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91619: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.91658: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.92256: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.92801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 7530 1727096009.92908: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 7530 1727096009.93031: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93125: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 7530 1727096009.93239: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93393: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93561: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 7530 1727096009.93564: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93597: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 7530 1727096009.93637: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 7530 1727096009.93707: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93790: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.93889: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94095: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 7530 1727096009.94364: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 7530 1727096009.94421: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94436: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94472: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 7530 1727096009.94488: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94539: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 7530 1727096009.94625: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94666: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94672: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.freebsd' # <<< 7530 1727096009.94695: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94740: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 7530 1727096009.94816: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94866: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.94940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 7530 1727096009.94943: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95215: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 7530 1727096009.95509: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95553: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 7530 1727096009.95660: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 7530 1727096009.95712: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95741: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 7530 1727096009.95794: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95818: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 7530 1727096009.95869: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.95955: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 7530 1727096009.96057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096009.96086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 7530 1727096009.96117: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 7530 1727096009.96189: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96210: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96223: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96269: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96318: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96389: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96470: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 7530 1727096009.96490: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 7530 1727096009.96536: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 7530 1727096009.96597: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.96801: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97010: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 7530 1727096009.97013: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97054: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 
1727096009.97110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 7530 1727096009.97113: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97157: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 7530 1727096009.97228: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97300: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 7530 1727096009.97403: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97487: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 7530 1727096009.97665: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096009.97884: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 7530 1727096009.97916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 7530 1727096009.97930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 7530 1727096009.97974: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' 
import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82a7b1a0> <<< 7530 1727096009.97992: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82a7b080> <<< 7530 1727096009.98028: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82a7a030> <<< 7530 1727096010.08902: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 7530 1727096010.08907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 7530 1727096010.08942: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82a7bd70> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 7530 1727096010.08963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 7530 1727096010.08966: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82ac0770> <<< 7530 1727096010.09017: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 7530 1727096010.09036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.09083: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82ac1f40> <<< 7530 1727096010.09087: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82ac1970> <<< 7530 1727096010.09348: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 7530 1727096010.09351: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 7530 1727096010.34206: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3054, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 477, "free": 3054}, "nocache": {"free": 3328, "used": 203}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 153, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261816197120, "block_size": 4096, "block_total": 65519099, "block_available": 63919970, "block_used": 1599129, "inode_total": 131070960, "inode_available": 131029232, "inode_used": 41728, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansib<<< 7530 1727096010.34266: stdout chunk (state=3): >>>le_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "53", "second": "30", "epoch": "1727096010", "epoch_int": "1727096010", "date": "2024-09-23", "time": "08:53:30", "iso8601_micro": "2024-09-23T12:53:30.300750Z", "iso8601": "2024-09-23T12:53:30Z", "iso8601_basic": "20240923T085330300750", "iso8601_basic_short": "20240923T085330", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4296875, "5m": 0.435546875, "15m": 0.181640625}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7530 1727096010.35055: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # 
cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy <<< 7530 1727096010.35228: stdout chunk (state=3): >>># cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] 
removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] 
removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # 
cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips <<< 7530 1727096010.35292: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] 
removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors <<< 7530 1727096010.35296: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace <<< 7530 1727096010.35364: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd <<< 7530 1727096010.35370: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 7530 1727096010.35829: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 7530 1727096010.35832: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 7530 1727096010.35836: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 7530 1727096010.35977: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 7530 1727096010.36019: stdout chunk (state=3): >>># destroy ntpath <<< 7530 1727096010.36022: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 7530 1727096010.36198: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # 
destroy argparse # destroy logging <<< 7530 1727096010.36203: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 7530 1727096010.36271: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 7530 1727096010.36296: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 7530 1727096010.36370: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 7530 1727096010.36404: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] 
wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 7530 1727096010.36506: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 7530 1727096010.36587: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 7530 1727096010.36634: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # 
cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 7530 1727096010.36672: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7530 1727096010.36820: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 7530 1727096010.36826: stdout chunk (state=3): >>># destroy _collections <<< 7530 1727096010.36855: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 7530 1727096010.36858: stdout chunk (state=3): >>># destroy tokenize <<< 7530 1727096010.36921: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 7530 1727096010.36925: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 7530 1727096010.36946: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 7530 1727096010.36969: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 7530 1727096010.37059: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 7530 1727096010.37077: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 7530 
1727096010.37117: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 7530 1727096010.37175: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 7530 1727096010.37178: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 7530 1727096010.37195: stdout chunk (state=3): >>># clear sys.audit hooks <<< 7530 1727096010.37622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096010.37626: stdout chunk (state=3): >>><<< 7530 1727096010.37628: stderr chunk (state=3): >>><<< 7530 1727096010.37886: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d83c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a6be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a6bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83aa37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d83aa3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a83ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a811f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a68fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83ac3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83ac2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83ac0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a68230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83af8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83af8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83a66d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83af9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83afa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83b11d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b12c30> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83b13290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b12180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83b13d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83b13440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83afa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83803bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8382da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83801d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83afabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d8385b140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8387b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838dc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838de9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838dc380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d838a5280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d836dd370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8387a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8382fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7d8387a420> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_5cpqtdle/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83742ff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83721ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83721040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83740ec0> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83776900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83776690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83775fa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d837763f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83743c80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d837776b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d837778f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83777e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83125bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d831277d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83128170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83129310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d8312bda0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d837230e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8312a060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83133d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83132810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83132570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83132ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8312a570> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83177f80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831780e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83179bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83179970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8317c140> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d8317a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8317f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8317c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83180b90> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83180710> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83180230> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831782f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8300c110> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8300d040> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831828a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83183c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83182540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d83011220> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83011fd0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8300d1f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83012210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830132c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d8301de80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8301bf50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83106630> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d831fe300> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83012f60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d83180d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b1a90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c2bc80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c2bfe0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d8309a660> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b25d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b0200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b3b90> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c46f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c467e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c469c0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c45c10> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c470e0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82c9dbb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c47bc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d830b3920> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c9f860> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82c9e780> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82ccdfa0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82cbede0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82ce1be0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82cbef90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d82a7b1a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82a7b080> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82a7a030> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82a7bd70> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82ac0770> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82ac1f40> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d82ac1970> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3054, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 477, "free": 3054}, "nocache": {"free": 3328, "used": 203}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 153, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", 
"options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261816197120, "block_size": 4096, "block_total": 65519099, "block_available": 63919970, "block_used": 1599129, "inode_total": 131070960, "inode_available": 131029232, "inode_used": 41728, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "53", "second": "30", "epoch": "1727096010", "epoch_int": "1727096010", "date": "2024-09-23", "time": "08:53:30", "iso8601_micro": "2024-09-23T12:53:30.300750Z", "iso8601": "2024-09-23T12:53:30Z", "iso8601_basic": "20240923T085330300750", "iso8601_basic_short": "20240923T085330", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4296875, "5m": 0.435546875, "15m": 0.181640625}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, 
"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed.
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
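The interpreter-discovery warning above has a documented remediation: set `ansible_python_interpreter` explicitly so the path no longer depends on discovery. A minimal inventory sketch, where the group name is illustrative but the host address and interpreter path are taken from this run:

```ini
; hypothetical inventory.ini -- pin the interpreter per host or group
[managed]
managed_node3 ansible_host=10.31.14.152

[managed:vars]
ansible_python_interpreter=/usr/bin/python3.12
```

With the variable set, the "discovered Python interpreter" warning is not emitted, because no discovery takes place for that host.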
7530 1727096010.39152: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096010.39156: _low_level_execute_command(): starting 7530 1727096010.39158: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096009.2594917-7545-41768837591938/ > /dev/null 2>&1 && sleep 0' 7530 1727096010.39677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096010.39681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096010.39688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096010.39691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096010.39694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096010.39696: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096010.39699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096010.39701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096010.39704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address <<< 7530 1727096010.39706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096010.39714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096010.39727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096010.39739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096010.39747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096010.39784: stderr chunk (state=3): >>>debug2: match found <<< 7530 1727096010.39787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096010.39833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096010.39845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096010.39871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096010.39934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096010.41874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096010.41878: stdout chunk (state=3): >>><<< 7530 1727096010.41894: stderr chunk (state=3): >>><<< 7530 1727096010.42074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096010.42078: handler run complete 7530 1727096010.42081: variable 'ansible_facts' from source: unknown 7530 1727096010.42158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.42534: variable 'ansible_facts' from source: unknown 7530 1727096010.42638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.42784: attempt loop complete, returning result 7530 1727096010.42794: _execute() done 7530 1727096010.42801: dumping result to json 7530 1727096010.42838: done dumping result, returning 7530 1727096010.42861: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-086b-f4f0-000000000155] 7530 1727096010.42873: sending task result for task 0afff68d-5257-086b-f4f0-000000000155 7530 1727096010.43341: done sending task result for task 0afff68d-5257-086b-f4f0-000000000155 7530 1727096010.43344: WORKER PROCESS EXITING ok: [managed_node3] 7530 1727096010.43958: no more pending results, returning what we have 7530 1727096010.43962: results queue empty 7530 1727096010.43963: checking for any_errors_fatal 7530 1727096010.43964: done checking for any_errors_fatal 7530 1727096010.43965: checking for max_fail_percentage 7530 
1727096010.43967: done checking for max_fail_percentage 7530 1727096010.43970: checking to see if all hosts have failed and the running result is not ok 7530 1727096010.43970: done checking to see if all hosts have failed 7530 1727096010.43971: getting the remaining hosts for this loop 7530 1727096010.43973: done getting the remaining hosts for this loop 7530 1727096010.43978: getting the next task for host managed_node3 7530 1727096010.43985: done getting next task for host managed_node3 7530 1727096010.43993: ^ task is: TASK: meta (flush_handlers) 7530 1727096010.43995: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096010.44000: getting variables 7530 1727096010.44002: in VariableManager get_vars() 7530 1727096010.44029: Calling all_inventory to load vars for managed_node3 7530 1727096010.44032: Calling groups_inventory to load vars for managed_node3 7530 1727096010.44035: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096010.44046: Calling all_plugins_play to load vars for managed_node3 7530 1727096010.44049: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096010.44052: Calling groups_plugins_play to load vars for managed_node3 7530 1727096010.44258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.44484: done with get_vars() 7530 1727096010.44497: done getting variables 7530 1727096010.44583: in VariableManager get_vars() 7530 1727096010.44595: Calling all_inventory to load vars for managed_node3 7530 1727096010.44598: Calling groups_inventory to load vars for managed_node3 7530 1727096010.44600: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096010.44605: 
Calling all_plugins_play to load vars for managed_node3 7530 1727096010.44614: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096010.44619: Calling groups_plugins_play to load vars for managed_node3 7530 1727096010.44784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.44998: done with get_vars() 7530 1727096010.45011: done queuing things up, now waiting for results queue to drain 7530 1727096010.45013: results queue empty 7530 1727096010.45014: checking for any_errors_fatal 7530 1727096010.45019: done checking for any_errors_fatal 7530 1727096010.45019: checking for max_fail_percentage 7530 1727096010.45020: done checking for max_fail_percentage 7530 1727096010.45021: checking to see if all hosts have failed and the running result is not ok 7530 1727096010.45022: done checking to see if all hosts have failed 7530 1727096010.45022: getting the remaining hosts for this loop 7530 1727096010.45023: done getting the remaining hosts for this loop 7530 1727096010.45026: getting the next task for host managed_node3 7530 1727096010.45030: done getting next task for host managed_node3 7530 1727096010.45032: ^ task is: TASK: Include the task 'el_repo_setup.yml' 7530 1727096010.45034: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096010.45036: getting variables 7530 1727096010.45037: in VariableManager get_vars() 7530 1727096010.45044: Calling all_inventory to load vars for managed_node3 7530 1727096010.45046: Calling groups_inventory to load vars for managed_node3 7530 1727096010.45048: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096010.45052: Calling all_plugins_play to load vars for managed_node3 7530 1727096010.45054: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096010.45056: Calling groups_plugins_play to load vars for managed_node3 7530 1727096010.45193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.45357: done with get_vars() 7530 1727096010.45364: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:11 Monday 23 September 2024 08:53:30 -0400 (0:00:01.232) 0:00:01.242 ****** 7530 1727096010.45449: entering _queue_task() for managed_node3/include_tasks 7530 1727096010.45452: Creating lock for include_tasks 7530 1727096010.45904: worker is 1 (out of 1 available) 7530 1727096010.45915: exiting _queue_task() for managed_node3/include_tasks 7530 1727096010.45927: done queuing things up, now waiting for results queue to drain 7530 1727096010.45929: waiting for pending results... 
7530 1727096010.46192: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 7530 1727096010.46198: in run() - task 0afff68d-5257-086b-f4f0-000000000006 7530 1727096010.46201: variable 'ansible_search_path' from source: unknown 7530 1727096010.46210: calling self._execute() 7530 1727096010.46295: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096010.46307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096010.46323: variable 'omit' from source: magic vars 7530 1727096010.46436: _execute() done 7530 1727096010.46504: dumping result to json 7530 1727096010.46508: done dumping result, returning 7530 1727096010.46511: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-086b-f4f0-000000000006] 7530 1727096010.46514: sending task result for task 0afff68d-5257-086b-f4f0-000000000006 7530 1727096010.46715: no more pending results, returning what we have 7530 1727096010.46723: in VariableManager get_vars() 7530 1727096010.46760: Calling all_inventory to load vars for managed_node3 7530 1727096010.46763: Calling groups_inventory to load vars for managed_node3 7530 1727096010.46775: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096010.46790: Calling all_plugins_play to load vars for managed_node3 7530 1727096010.46793: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096010.46796: Calling groups_plugins_play to load vars for managed_node3 7530 1727096010.47170: done sending task result for task 0afff68d-5257-086b-f4f0-000000000006 7530 1727096010.47173: WORKER PROCESS EXITING 7530 1727096010.47191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.47390: done with get_vars() 7530 1727096010.47397: variable 'ansible_search_path' from source: unknown 7530 1727096010.47410: we have included files to process 7530 
1727096010.47411: generating all_blocks data 7530 1727096010.47412: done generating all_blocks data 7530 1727096010.47413: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7530 1727096010.47414: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7530 1727096010.47416: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7530 1727096010.48110: in VariableManager get_vars() 7530 1727096010.48129: done with get_vars() 7530 1727096010.48146: done processing included file 7530 1727096010.48148: iterating over new_blocks loaded from include file 7530 1727096010.48149: in VariableManager get_vars() 7530 1727096010.48160: done with get_vars() 7530 1727096010.48161: filtering new block on tags 7530 1727096010.48181: done filtering new block on tags 7530 1727096010.48184: in VariableManager get_vars() 7530 1727096010.48196: done with get_vars() 7530 1727096010.48198: filtering new block on tags 7530 1727096010.48213: done filtering new block on tags 7530 1727096010.48216: in VariableManager get_vars() 7530 1727096010.48228: done with get_vars() 7530 1727096010.48229: filtering new block on tags 7530 1727096010.48243: done filtering new block on tags 7530 1727096010.48245: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 7530 1727096010.48256: extending task lists for all hosts with included blocks 7530 1727096010.48311: done extending task lists 7530 1727096010.48312: done processing included files 7530 1727096010.48313: results queue empty 7530 1727096010.48314: checking for any_errors_fatal 7530 1727096010.48315: done checking for any_errors_fatal 7530 1727096010.48316: checking for max_fail_percentage 7530 
1727096010.48320: done checking for max_fail_percentage 7530 1727096010.48320: checking to see if all hosts have failed and the running result is not ok 7530 1727096010.48321: done checking to see if all hosts have failed 7530 1727096010.48322: getting the remaining hosts for this loop 7530 1727096010.48323: done getting the remaining hosts for this loop 7530 1727096010.48325: getting the next task for host managed_node3 7530 1727096010.48329: done getting next task for host managed_node3 7530 1727096010.48331: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 7530 1727096010.48334: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096010.48336: getting variables 7530 1727096010.48337: in VariableManager get_vars() 7530 1727096010.48345: Calling all_inventory to load vars for managed_node3 7530 1727096010.48348: Calling groups_inventory to load vars for managed_node3 7530 1727096010.48350: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096010.48360: Calling all_plugins_play to load vars for managed_node3 7530 1727096010.48363: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096010.48366: Calling groups_plugins_play to load vars for managed_node3 7530 1727096010.48543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096010.48747: done with get_vars() 7530 1727096010.48755: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:53:30 -0400 (0:00:00.033) 0:00:01.276 ****** 7530 1727096010.48835: entering _queue_task() for managed_node3/setup 7530 1727096010.49187: worker is 1 (out of 1 available) 7530 1727096010.49200: exiting _queue_task() for managed_node3/setup 7530 1727096010.49213: done queuing things up, now waiting for results queue to drain 7530 1727096010.49214: waiting for pending results... 
7530 1727096010.49500: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 7530 1727096010.49631: in run() - task 0afff68d-5257-086b-f4f0-000000000166 7530 1727096010.49651: variable 'ansible_search_path' from source: unknown 7530 1727096010.49659: variable 'ansible_search_path' from source: unknown 7530 1727096010.49727: calling self._execute() 7530 1727096010.49811: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096010.49826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096010.49893: variable 'omit' from source: magic vars 7530 1727096010.50422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096010.52807: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096010.52892: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096010.52942: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096010.52987: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096010.53045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096010.53111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096010.53154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096010.53186: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096010.53261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096010.53265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096010.53450: variable 'ansible_facts' from source: unknown 7530 1727096010.53537: variable 'network_test_required_facts' from source: task vars 7530 1727096010.53634: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 7530 1727096010.53638: variable 'omit' from source: magic vars 7530 1727096010.53643: variable 'omit' from source: magic vars 7530 1727096010.53688: variable 'omit' from source: magic vars 7530 1727096010.53728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096010.53762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096010.53788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096010.53851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096010.53855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096010.53871: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096010.53879: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 
1727096010.53887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096010.53992: Set connection var ansible_pipelining to False 7530 1727096010.54003: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096010.54012: Set connection var ansible_timeout to 10 7530 1727096010.54065: Set connection var ansible_shell_executable to /bin/sh 7530 1727096010.54070: Set connection var ansible_shell_type to sh 7530 1727096010.54072: Set connection var ansible_connection to ssh 7530 1727096010.54074: variable 'ansible_shell_executable' from source: unknown 7530 1727096010.54076: variable 'ansible_connection' from source: unknown 7530 1727096010.54085: variable 'ansible_module_compression' from source: unknown 7530 1727096010.54174: variable 'ansible_shell_type' from source: unknown 7530 1727096010.54177: variable 'ansible_shell_executable' from source: unknown 7530 1727096010.54179: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096010.54182: variable 'ansible_pipelining' from source: unknown 7530 1727096010.54186: variable 'ansible_timeout' from source: unknown 7530 1727096010.54188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096010.54441: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096010.54461: variable 'omit' from source: magic vars 7530 1727096010.54472: starting attempt loop 7530 1727096010.54479: running the handler 7530 1727096010.54496: _low_level_execute_command(): starting 7530 1727096010.54509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096010.55295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096010.55336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096010.55356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096010.55459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096010.57133: stdout chunk (state=3): >>>/root <<< 7530 1727096010.57279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096010.57373: stdout chunk (state=3): >>><<< 7530 1727096010.57378: stderr chunk (state=3): >>><<< 7530 1727096010.57381: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096010.57393: _low_level_execute_command(): starting 7530 1727096010.57396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905 `" && echo ansible-tmp-1727096010.5732653-7589-112236536380905="` echo /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905 `" ) && sleep 0' 7530 1727096010.58350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096010.58364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096010.58383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096010.58404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096010.58426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096010.58511: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096010.58537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096010.58551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096010.58581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096010.58775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096010.60770: stdout chunk (state=3): >>>ansible-tmp-1727096010.5732653-7589-112236536380905=/root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905 <<< 7530 1727096010.60857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096010.60907: stderr chunk (state=3): >>><<< 7530 1727096010.60910: stdout chunk (state=3): >>><<< 7530 1727096010.61100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096010.5732653-7589-112236536380905=/root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096010.61103: variable 'ansible_module_compression' from source: unknown 7530 1727096010.61105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7530 1727096010.61156: variable 'ansible_facts' from source: unknown 7530 1727096010.61377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py 7530 1727096010.61623: Sending initial data 7530 1727096010.61626: Sent initial data (152 bytes) 7530 1727096010.62356: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096010.62495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096010.62528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096010.64203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096010.64279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096010.64355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpjhz4ijyc /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py <<< 7530 1727096010.64359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py" <<< 7530 1727096010.64422: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpjhz4ijyc" to remote "/root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py" <<< 7530 1727096010.66595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096010.66735: stderr chunk (state=3): >>><<< 7530 1727096010.66738: stdout chunk (state=3): >>><<< 7530 1727096010.66741: done transferring module to remote 7530 1727096010.66743: _low_level_execute_command(): starting 7530 1727096010.66745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/ /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py && sleep 0' 7530 1727096010.68089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096010.68202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096010.68391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096010.68480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096010.70358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096010.70363: stdout chunk (state=3): >>><<< 7530 1727096010.70365: stderr chunk (state=3): >>><<< 7530 1727096010.70386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096010.70395: _low_level_execute_command(): starting 7530 1727096010.70405: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/AnsiballZ_setup.py && sleep 0' 7530 1727096010.71876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096010.71990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096010.72176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096010.72385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096010.75165: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' 
# <<< 7530 1727096010.75218: stdout chunk (state=3): >>>import '_io' # <<< 7530 1727096010.75223: stdout chunk (state=3): >>> <<< 7530 1727096010.75225: stdout chunk (state=3): >>>import 'marshal' # <<< 7530 1727096010.75287: stdout chunk (state=3): >>>import 'posix' # <<< 7530 1727096010.75291: stdout chunk (state=3): >>> <<< 7530 1727096010.75336: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7530 1727096010.75351: stdout chunk (state=3): >>> <<< 7530 1727096010.75358: stdout chunk (state=3): >>># installing zipimport hook <<< 7530 1727096010.75396: stdout chunk (state=3): >>>import 'time' # <<< 7530 1727096010.75401: stdout chunk (state=3): >>> <<< 7530 1727096010.75419: stdout chunk (state=3): >>>import 'zipimport' # <<< 7530 1727096010.75423: stdout chunk (state=3): >>> # installed zipimport hook<<< 7530 1727096010.75435: stdout chunk (state=3): >>> <<< 7530 1727096010.75502: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 7530 1727096010.75505: stdout chunk (state=3): >>> <<< 7530 1727096010.75547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 7530 1727096010.75550: stdout chunk (state=3): >>> <<< 7530 1727096010.75588: stdout chunk (state=3): >>>import 'codecs' # <<< 7530 1727096010.75591: stdout chunk (state=3): >>> <<< 7530 1727096010.75669: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 7530 1727096010.75680: stdout chunk (state=3): >>> <<< 7530 1727096010.75690: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e5104d0><<< 7530 1727096010.75709: stdout chunk (state=3): >>> import 
'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e4dfb30><<< 7530 1727096010.75713: stdout chunk (state=3): >>> <<< 7530 1727096010.75764: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e512a50> <<< 7530 1727096010.75824: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 7530 1727096010.75842: stdout chunk (state=3): >>>import 'abc' # <<< 7530 1727096010.75876: stdout chunk (state=3): >>> import 'io' # <<< 7530 1727096010.75880: stdout chunk (state=3): >>> <<< 7530 1727096010.75926: stdout chunk (state=3): >>>import '_stat' # <<< 7530 1727096010.75934: stdout chunk (state=3): >>> <<< 7530 1727096010.76065: stdout chunk (state=3): >>>import 'stat' # <<< 7530 1727096010.76069: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7530 1727096010.76126: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7530 1727096010.76184: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 7530 1727096010.76187: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 7530 1727096010.76239: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 7530 1727096010.76242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 7530 1727096010.76265: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff0e305130> <<< 7530 1727096010.76312: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 7530 1727096010.76329: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e305fa0> <<< 7530 1727096010.76347: stdout chunk (state=3): >>>import 'site' # <<< 7530 1727096010.76396: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7530 1727096010.76759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7530 1727096010.76763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7530 1727096010.76783: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 7530 1727096010.76799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.76829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 7530 1727096010.76857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 7530 1727096010.76883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7530 1727096010.76892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 7530 
1727096010.77079: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e343ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e343f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7530 1727096010.77113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 7530 1727096010.77169: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e37b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 7530 1727096010.77173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e37bec0> <<< 7530 1727096010.77240: stdout chunk (state=3): >>>import '_collections' # <<< 7530 1727096010.77254: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e35bb60> <<< 7530 1727096010.77259: stdout chunk (state=3): >>>import '_functools' # <<< 7530 1727096010.77295: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 
0x7eff0e3592b0> <<< 7530 1727096010.77424: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e341070> <<< 7530 1727096010.77449: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7530 1727096010.77574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7530 1727096010.77601: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e39b7d0><<< 7530 1727096010.77607: stdout chunk (state=3): >>> <<< 7530 1727096010.77632: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e39a3f0><<< 7530 1727096010.77637: stdout chunk (state=3): >>> <<< 7530 1727096010.77672: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 7530 1727096010.77677: stdout chunk (state=3): >>> <<< 7530 1727096010.77843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e35a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e398bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 
'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3402f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3d0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d0bf0> <<< 7530 1727096010.77870: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.77892: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.77898: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3d0fe0><<< 7530 1727096010.77909: stdout chunk (state=3): >>> <<< 7530 1727096010.77925: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e33ee10> <<< 7530 1727096010.77970: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 7530 1727096010.77975: stdout chunk (state=3): >>> <<< 7530 1727096010.77989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.78023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 7530 
1727096010.78030: stdout chunk (state=3): >>> <<< 7530 1727096010.78072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 7530 1727096010.78077: stdout chunk (state=3): >>> <<< 7530 1727096010.78101: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d1670><<< 7530 1727096010.78110: stdout chunk (state=3): >>> <<< 7530 1727096010.78119: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d1370> <<< 7530 1727096010.78143: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 7530 1727096010.78150: stdout chunk (state=3): >>> <<< 7530 1727096010.78186: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 7530 1727096010.78190: stdout chunk (state=3): >>> <<< 7530 1727096010.78224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d2540><<< 7530 1727096010.78228: stdout chunk (state=3): >>> <<< 7530 1727096010.78251: stdout chunk (state=3): >>>import 'importlib.util' # <<< 7530 1727096010.78277: stdout chunk (state=3): >>> import 'runpy' # <<< 7530 1727096010.78356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7530 1727096010.78392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 7530 1727096010.78411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 7530 1727096010.78430: stdout chunk (state=3): >>>import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3e8740><<< 7530 1727096010.78435: stdout chunk (state=3): >>> <<< 7530 1727096010.78489: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.78507: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.78512: stdout chunk (state=3): >>> <<< 7530 1727096010.78544: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3e9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 7530 1727096010.78550: stdout chunk (state=3): >>> <<< 7530 1727096010.78570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 7530 1727096010.78604: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 7530 1727096010.78610: stdout chunk (state=3): >>> <<< 7530 1727096010.78627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 7530 1727096010.78633: stdout chunk (state=3): >>> <<< 7530 1727096010.78695: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3eacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.78719: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3eb2f0><<< 7530 1727096010.78722: stdout chunk (state=3): >>> <<< 7530 1727096010.78766: stdout chunk (state=3): >>>import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3ea210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 7530 1727096010.78775: stdout chunk (state=3): >>> <<< 7530 1727096010.78788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 7530 1727096010.78835: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.78839: stdout chunk (state=3): >>> <<< 7530 1727096010.78870: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.78873: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3ebd70> <<< 7530 1727096010.78938: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3eb4a0> <<< 7530 1727096010.78961: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d24b0><<< 7530 1727096010.78966: stdout chunk (state=3): >>> <<< 7530 1727096010.78998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 7530 1727096010.79003: stdout chunk (state=3): >>> <<< 7530 1727096010.79045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 7530 1727096010.79050: stdout chunk (state=3): >>> <<< 7530 1727096010.79103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 7530 1727096010.79107: stdout chunk (state=3): >>> <<< 7530 1727096010.79150: stdout chunk (state=3): >>># extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.79155: stdout chunk (state=3): >>> <<< 7530 1727096010.79192: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e0dfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 7530 1727096010.79199: stdout chunk (state=3): >>> <<< 7530 1727096010.79207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 7530 1727096010.79252: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.79263: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.79279: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e1087a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e108500><<< 7530 1727096010.79312: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.79319: stdout chunk (state=3): >>> <<< 7530 1727096010.79337: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.79340: stdout chunk (state=3): >>> <<< 7530 1727096010.79381: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e1087d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 7530 1727096010.79387: stdout chunk (state=3): 
>>> <<< 7530 1727096010.79406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 7530 1727096010.79496: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.79637: stdout chunk (state=3): >>> <<< 7530 1727096010.79706: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.79713: stdout chunk (state=3): >>> import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e109100><<< 7530 1727096010.79715: stdout chunk (state=3): >>> <<< 7530 1727096010.79910: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096010.79913: stdout chunk (state=3): >>> <<< 7530 1727096010.79932: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.79939: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e109af0><<< 7530 1727096010.79960: stdout chunk (state=3): >>> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1089b0><<< 7530 1727096010.79969: stdout chunk (state=3): >>> <<< 7530 1727096010.79996: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e0dddf0><<< 7530 1727096010.79999: stdout chunk (state=3): >>> <<< 7530 1727096010.80034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 7530 1727096010.80039: stdout chunk (state=3): >>> <<< 7530 1727096010.80075: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 7530 1727096010.80080: stdout chunk (state=3): >>> <<< 7530 1727096010.80109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 7530 1727096010.80114: stdout chunk (state=3): >>> <<< 7530 1727096010.80140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 7530 1727096010.80144: stdout chunk (state=3): >>> <<< 7530 1727096010.80190: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e10af00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e109c40><<< 7530 1727096010.80200: stdout chunk (state=3): >>> <<< 7530 1727096010.80229: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d2c60><<< 7530 1727096010.80235: stdout chunk (state=3): >>> <<< 7530 1727096010.80342: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7530 1727096010.80368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.80400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 7530 1727096010.80403: stdout chunk (state=3): >>> <<< 7530 1727096010.80460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7530 1727096010.80517: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e133230> <<< 7530 1727096010.80634: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/__init__.py <<< 7530 1727096010.80637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.80639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 7530 1727096010.80642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7530 1727096010.80679: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1575f0> <<< 7530 1727096010.80694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 7530 1727096010.80740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7530 1727096010.80789: stdout chunk (state=3): >>>import 'ntpath' # <<< 7530 1727096010.80814: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1b8380> <<< 7530 1727096010.80841: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 7530 1727096010.80862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 7530 1727096010.80888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7530 1727096010.80921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7530 1727096010.81014: stdout chunk (state=3): 
>>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1baae0> <<< 7530 1727096010.81090: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1b84a0> <<< 7530 1727096010.81120: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e179370> <<< 7530 1727096010.81158: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0dfc5430> <<< 7530 1727096010.81186: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1563f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e10be00> <<< 7530 1727096010.81344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7530 1727096010.81372: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7eff0e156750> <<< 7530 1727096010.81751: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_taov23ay/ansible_setup_payload.zip' # zipimport: zlib available <<< 7530 1727096010.81877: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.81904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 7530 1727096010.81932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7530 1727096010.81969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7530 1727096010.82051: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7530 1727096010.82082: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e02f170> <<< 7530 1727096010.82095: stdout chunk (state=3): >>>import '_typing' # <<< 7530 1727096010.82281: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e00e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e00d1c0> <<< 7530 1727096010.82305: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.82325: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 7530 1727096010.82358: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.82386: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 7530 1727096010.82396: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.83817: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.84999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e02d010> <<< 7530 1727096010.85034: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.85045: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7530 1727096010.85073: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 7530 1727096010.85110: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.85121: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e05ea20> <<< 7530 1727096010.85149: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05e7b0> <<< 7530 1727096010.85189: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05e0f0> <<< 7530 1727096010.85213: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 7530 1727096010.85226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7530 1727096010.85260: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05e540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e02fb90> <<< 7530 1727096010.85295: stdout chunk (state=3): >>>import 'atexit' # <<< 7530 1727096010.85298: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e05f7d0> <<< 7530 1727096010.85330: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e05fa10> <<< 7530 1727096010.85349: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 7530 1727096010.85405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 7530 1727096010.85417: stdout chunk (state=3): >>>import '_locale' # <<< 7530 1727096010.85466: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05ff50> <<< 7530 1727096010.85493: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 7530 1727096010.85521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7530 1727096010.85559: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d931d00> <<< 7530 1727096010.85592: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d933920> <<< 7530 1727096010.85619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py <<< 7530 1727096010.85634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7530 1727096010.85667: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d9342f0> <<< 7530 1727096010.85684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7530 1727096010.85726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7530 1727096010.85766: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d935490> <<< 7530 1727096010.85772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 7530 1727096010.85798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7530 1727096010.85828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7530 1727096010.85885: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d937f80> <<< 7530 1727096010.85920: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d93c2f0> <<< 7530 1727096010.85954: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d936240> <<< 7530 
1727096010.85966: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 7530 1727096010.86004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 7530 1727096010.86014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 7530 1727096010.86044: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7530 1727096010.86170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 7530 1727096010.86197: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93fef0> <<< 7530 1727096010.86221: stdout chunk (state=3): >>>import '_tokenize' # <<< 7530 1727096010.86293: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93e9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93e720> <<< 7530 1727096010.86315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 7530 1727096010.86332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7530 1727096010.86390: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93ec90> <<< 7530 1727096010.86421: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7eff0d936750> <<< 7530 1727096010.86453: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d983f50> <<< 7530 1727096010.86498: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d984230> <<< 7530 1727096010.86513: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 7530 1727096010.86545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7530 1727096010.86561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7530 1727096010.86605: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d985cd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d985a90> <<< 7530 1727096010.86623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches 
/usr/lib64/python3.12/uuid.py <<< 7530 1727096010.86655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7530 1727096010.86709: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d988260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d986390> <<< 7530 1727096010.86734: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7530 1727096010.86796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.86814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 7530 1727096010.86841: stdout chunk (state=3): >>>import '_string' # <<< 7530 1727096010.86871: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d98b9b0> <<< 7530 1727096010.86997: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d9883b0> <<< 7530 1727096010.87064: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98c7a0> <<< 7530 1727096010.87095: stdout chunk 
(state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98c9b0> <<< 7530 1727096010.87147: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98ccb0> <<< 7530 1727096010.87194: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d984350> <<< 7530 1727096010.87197: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 7530 1727096010.87225: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7530 1727096010.87262: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.87285: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d818380> <<< 7530 1727096010.87434: stdout chunk (state=3): >>># extension module 
'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.87464: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d819430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d98eb10> <<< 7530 1727096010.87498: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98fec0> <<< 7530 1727096010.87533: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d98e750> # zipimport: zlib available <<< 7530 1727096010.87555: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 7530 1727096010.87572: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.87651: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.87758: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.87761: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 7530 1727096010.87810: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 7530 1727096010.87813: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.87930: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.88046: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.88618: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7530 1727096010.89200: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 7530 1727096010.89251: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.89298: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d81d760> <<< 7530 1727096010.89401: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 7530 1727096010.89413: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81e450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d819580> <<< 7530 1727096010.89465: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7530 1727096010.89487: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.89517: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.89528: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 7530 1727096010.89676: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.89831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches 
/usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 7530 1727096010.89860: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81e600> # zipimport: zlib available <<< 7530 1727096010.90335: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.91050: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.91055: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 7530 1727096010.91102: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.91183: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 7530 1727096010.91211: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 7530 1727096010.91275: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.91287: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.91357: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 7530 1727096010.91562: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.91804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7530 1727096010.91886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 7530 1727096010.92268: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81f890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 
'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 7530 1727096010.92281: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.92336: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 7530 1727096010.92358: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.92413: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.92489: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.92595: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096010.92597: stdout chunk (state=3): >>> <<< 7530 1727096010.92699: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7530 1727096010.92763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096010.92897: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d82a270> <<< 7530 1727096010.92954: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d825ac0> <<< 7530 1727096010.93025: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 7530 1727096010.93113: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.93205: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 7530 1727096010.93320: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 7530 1727096010.93360: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 7530 1727096010.93384: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 7530 1727096010.93481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 7530 1727096010.93555: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7530 1727096010.93662: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d902ab0> <<< 7530 1727096010.93756: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d9fe780> <<< 7530 1727096010.93850: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d82a2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81f110> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 7530 1727096010.93938: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 7530 1727096010.94273: stdout chunk 
(state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 7530 1727096010.94303: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7530 1727096010.94441: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7530 1727096010.94482: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.94532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 7530 1727096010.94645: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.94671: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.94787: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.94824: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.94885: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 7530 1727096010.95323: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.95625: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8ba4b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 7530 1727096010.95635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 7530 1727096010.95679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 7530 1727096010.95731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 7530 1727096010.95735: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d528290> <<< 7530 1727096010.95762: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096010.95788: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d5285f0> <<< 7530 1727096010.95834: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8a4230> <<< 7530 1727096010.95860: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8bafc0> <<< 7530 1727096010.95900: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8b8bf0> <<< 7530 1727096010.95992: stdout chunk (state=3): >>>import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8b87d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 7530 1727096010.96012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 7530 1727096010.96047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 7530 1727096010.96077: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d52b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d52ae40> <<< 7530 1727096010.96113: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d52afc0> <<< 7530 1727096010.96247: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d52a2a0> <<< 7530 1727096010.96270: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 7530 1727096010.96297: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d52b680> <<< 7530 1727096010.96312: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 7530 1727096010.96381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d576180> <<< 7530 1727096010.96405: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d5741a0> <<< 7530 1727096010.96445: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8b8860> import 'ansible.module_utils.facts.timeout' # <<< 7530 1727096010.96504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 7530 1727096010.96520: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 7530 1727096010.96604: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.96626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 7530 1727096010.96646: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.96698: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.96745: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # 
<<< 7530 1727096010.96784: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 7530 1727096010.96914: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096010.96922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 7530 1727096010.96960: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 7530 1727096010.96991: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.97161: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 7530 1727096010.97176: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.97190: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096010.97245: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.97310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 7530 1727096010.97355: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.98213: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.98905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 7530 1727096010.98909: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.98983: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99091: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096010.99152: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 7530 1727096010.99201: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 
1727096010.99255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 7530 1727096010.99276: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99351: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99411: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 7530 1727096010.99443: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99526: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 7530 1727096010.99530: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99566: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 7530 1727096010.99737: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096010.99876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 7530 1727096010.99936: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d577fb0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 7530 1727096010.99974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 7530 1727096011.00189: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d576db0> import 'ansible.module_utils.facts.system.local' # <<< 7530 1727096011.00281: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.00380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 7530 
1727096011.00395: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.00406: stdout chunk (state=3): >>> <<< 7530 1727096011.00534: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.00683: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 7530 1727096011.00687: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.00689: stdout chunk (state=3): >>> <<< 7530 1727096011.00783: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.00890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 7530 1727096011.00894: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.00955: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.01026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 7530 1727096011.01100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 7530 1727096011.01194: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.01291: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d5b63c0> <<< 7530 1727096011.01960: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d988200> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.02028: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.02208: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7530 1727096011.02457: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 7530 1727096011.02461: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.02507: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.02573: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 7530 1727096011.02636: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.02726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 7530 1727096011.02791: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.02794: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d5c9f70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d5a6f60> <<< 7530 1727096011.02847: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 7530 1727096011.02902: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.02950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 7530 1727096011.02959: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03218: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03471: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.aix' # <<< 7530 1727096011.03475: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03634: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03775: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03879: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 7530 1727096011.03914: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03933: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.03961: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.04104: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.04258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 7530 1727096011.04272: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 7530 1727096011.04385: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.04507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 7530 1727096011.04527: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.04549: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.04586: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.05394: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.06236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 7530 1727096011.06271: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available<<< 7530 1727096011.06351: stdout chunk (state=3): >>> <<< 7530 1727096011.06460: stdout chunk (state=3): >>># zipimport: zlib available<<< 
7530 1727096011.06466: stdout chunk (state=3): >>> <<< 7530 1727096011.06626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 7530 1727096011.06632: stdout chunk (state=3): >>> <<< 7530 1727096011.06658: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.06664: stdout chunk (state=3): >>> <<< 7530 1727096011.06818: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.06822: stdout chunk (state=3): >>> <<< 7530 1727096011.06975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 7530 1727096011.07005: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7530 1727096011.07012: stdout chunk (state=3): >>> <<< 7530 1727096011.07258: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.07619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 7530 1727096011.07744: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 7530 1727096011.07778: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.07787: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.07850: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.08211: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.08384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.08388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 7530 1727096011.08439: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.network.dragonfly' # <<< 7530 1727096011.08462: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.08516: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.08610: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 7530 1727096011.08634: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 7530 1727096011.08714: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.08899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 7530 1727096011.09483: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 7530 1727096011.09593: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09658: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 7530 1727096011.09753: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09795: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 7530 1727096011.09846: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09883: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.09923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 7530 1727096011.09935: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10013: stdout chunk (state=3): >>># zipimport: zlib available 
import 'ansible.module_utils.facts.network.openbsd' # <<< 7530 1727096011.10025: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10136: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10244: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 7530 1727096011.10273: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.10299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 7530 1727096011.10437: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.10522: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.10586: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10700: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 7530 1727096011.10805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 7530 1727096011.10810: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10880: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.10941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 7530 1727096011.10959: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.11354: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.11587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 7530 1727096011.11653: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.11732: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 7530 1727096011.11786: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.11854: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 7530 1727096011.12108: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 7530 1727096011.12224: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.12350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 7530 1727096011.12435: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.12624: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 7530 1727096011.12647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 7530 1727096011.12685: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.12703: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d3c70e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d3c4830> <<< 7530 1727096011.12744: stdout chunk (state=3): >>>import 
'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d3c51f0> <<< 7530 1727096011.14513: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_dns": 
{"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "53", "second": "31", "epoch": "1727096011", "epoch_int": "1727096011", "date": "2024-09-23", "time": "08:53:31", "iso8601_micro": "2024-09-23T12:53:31.139423Z", "iso8601": "2024-09-23T12:53:31Z", "iso8601_basic": "20240923T085331139423", "iso8601_basic_short": "20240923T085331", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhS<<< 7530 1727096011.14529: stdout chunk (state=3): >>>SiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7530 1727096011.15209: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 7530 1727096011.15280: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 7530 1727096011.15329: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # 
cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # 
cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token <<< 7530 1727096011.15356: stdout 
chunk (state=3): >>># destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array <<< 7530 1727096011.15379: stdout chunk (state=3): >>># cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes<<< 7530 1727096011.15392: stdout chunk (state=3): >>> # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast <<< 7530 1727096011.15425: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file<<< 7530 1727096011.15434: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils <<< 7530 1727096011.15572: stdout chunk (state=3): >>># destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic <<< 7530 1727096011.15602: stdout chunk 
(state=3): >>># cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # 
cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dr<<< 7530 1727096011.15614: stdout chunk (state=3): >>>agonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd<<< 7530 1727096011.15636: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn<<< 
7530 1727096011.15648: stdout chunk (state=3): >>> <<< 7530 1727096011.15659: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme<<< 7530 1727096011.15683: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 7530 1727096011.15837: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 7530 1727096011.16280: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 7530 1727096011.16289: stdout chunk (state=3): >>> <<< 7530 1727096011.16315: stdout chunk (state=3): >>># destroy importlib.machinery <<< 7530 1727096011.16336: stdout chunk (state=3): >>># destroy importlib._abc<<< 7530 1727096011.16346: stdout chunk (state=3): >>> <<< 7530 1727096011.16356: stdout chunk (state=3): >>># destroy importlib.util <<< 7530 1727096011.16393: stdout chunk (state=3): >>># destroy _bz2<<< 7530 1727096011.16398: stdout chunk (state=3): >>> <<< 7530 1727096011.16426: stdout chunk (state=3): >>># destroy _compression <<< 7530 1727096011.16432: stdout 
chunk (state=3): >>># destroy _lzma <<< 7530 1727096011.16456: stdout chunk (state=3): >>># destroy _blake2<<< 7530 1727096011.16465: stdout chunk (state=3): >>> <<< 7530 1727096011.16489: stdout chunk (state=3): >>># destroy binascii <<< 7530 1727096011.16503: stdout chunk (state=3): >>># destroy zlib<<< 7530 1727096011.16509: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma<<< 7530 1727096011.16549: stdout chunk (state=3): >>> # destroy zipfile._path # destroy zipfile<<< 7530 1727096011.16558: stdout chunk (state=3): >>> <<< 7530 1727096011.16574: stdout chunk (state=3): >>># destroy pathlib<<< 7530 1727096011.16582: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy ipaddress<<< 7530 1727096011.16637: stdout chunk (state=3): >>> # destroy ntpath<<< 7530 1727096011.16641: stdout chunk (state=3): >>> <<< 7530 1727096011.16669: stdout chunk (state=3): >>># destroy importlib<<< 7530 1727096011.16681: stdout chunk (state=3): >>> <<< 7530 1727096011.16692: stdout chunk (state=3): >>># destroy zipimport <<< 7530 1727096011.16716: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 7530 1727096011.16734: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner<<< 7530 1727096011.16753: stdout chunk (state=3): >>> # destroy _json<<< 7530 1727096011.16765: stdout chunk (state=3): >>> # destroy grp # destroy encodings<<< 7530 1727096011.16786: stdout chunk (state=3): >>> # destroy _locale<<< 7530 1727096011.16818: stdout chunk (state=3): >>> # destroy locale <<< 7530 1727096011.16831: stdout chunk (state=3): >>># destroy select<<< 7530 1727096011.16834: stdout chunk (state=3): >>> # destroy _signal<<< 7530 1727096011.16857: stdout chunk (state=3): >>> # destroy _posixsubprocess <<< 7530 1727096011.16926: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux<<< 7530 1727096011.16938: stdout chunk (state=3): >>> <<< 7530 
1727096011.16947: stdout chunk (state=3): >>># destroy shutil <<< 7530 1727096011.16981: stdout chunk (state=3): >>># destroy distro<<< 7530 1727096011.16992: stdout chunk (state=3): >>> # destroy distro.distro<<< 7530 1727096011.17003: stdout chunk (state=3): >>> <<< 7530 1727096011.17057: stdout chunk (state=3): >>># destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors<<< 7530 1727096011.17078: stdout chunk (state=3): >>> <<< 7530 1727096011.17081: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector <<< 7530 1727096011.17106: stdout chunk (state=3): >>># destroy multiprocessing<<< 7530 1727096011.17114: stdout chunk (state=3): >>> <<< 7530 1727096011.17131: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy multiprocessing.pool<<< 7530 1727096011.17141: stdout chunk (state=3): >>> # destroy signal # destroy pickle<<< 7530 1727096011.17180: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle<<< 7530 1727096011.17196: stdout chunk (state=3): >>> <<< 7530 1727096011.17212: stdout chunk (state=3): >>># destroy queue<<< 7530 1727096011.17225: stdout chunk (state=3): >>> # destroy _heapq # destroy _queue<<< 7530 1727096011.17233: stdout chunk (state=3): >>> # destroy multiprocessing.process <<< 7530 1727096011.17259: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors<<< 7530 1727096011.17304: stdout chunk (state=3): >>> # destroy _multiprocessing # destroy shlex<<< 7530 1727096011.17309: stdout chunk (state=3): >>> # destroy fcntl<<< 7530 1727096011.17337: stdout chunk (state=3): >>> # destroy datetime<<< 7530 1727096011.17361: stdout chunk (state=3): >>> # destroy subprocess<<< 7530 1727096011.17368: stdout chunk (state=3): >>> <<< 7530 1727096011.17414: stdout chunk (state=3): >>># destroy base64 # destroy 
_ssl<<< 7530 1727096011.17424: stdout chunk (state=3): >>> <<< 7530 1727096011.17454: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 7530 1727096011.17473: stdout chunk (state=3): >>> # destroy getpass<<< 7530 1727096011.17503: stdout chunk (state=3): >>> # destroy pwd # destroy termios # destroy errno<<< 7530 1727096011.17510: stdout chunk (state=3): >>> <<< 7530 1727096011.17553: stdout chunk (state=3): >>># destroy json # destroy socket<<< 7530 1727096011.17571: stdout chunk (state=3): >>> <<< 7530 1727096011.17574: stdout chunk (state=3): >>># destroy struct<<< 7530 1727096011.17583: stdout chunk (state=3): >>> <<< 7530 1727096011.17596: stdout chunk (state=3): >>># destroy glob<<< 7530 1727096011.17615: stdout chunk (state=3): >>> # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout<<< 7530 1727096011.17686: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna<<< 7530 1727096011.17702: stdout chunk (state=3): >>> <<< 7530 1727096011.17715: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser<<< 7530 1727096011.17729: stdout chunk (state=3): >>> <<< 7530 1727096011.17732: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux<<< 7530 1727096011.17746: stdout chunk (state=3): >>> <<< 7530 1727096011.17766: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes<<< 7530 1727096011.17780: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser<<< 7530 1727096011.17789: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 7530 1727096011.17804: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping 
_string<<< 7530 1727096011.17819: stdout chunk (state=3): >>> # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 7530 1727096011.17839: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize <<< 7530 1727096011.17866: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 7530 1727096011.17892: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 7530 1727096011.17901: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random<<< 7530 1727096011.17922: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 7530 1727096011.17942: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 7530 1727096011.17974: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 7530 1727096011.17985: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 7530 1727096011.18009: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 7530 1727096011.18021: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc <<< 7530 1727096011.18043: stdout chunk (state=3): >>># cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 7530 1727096011.18060: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath<<< 7530 1727096011.18089: stdout 
chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 7530 1727096011.18098: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases<<< 7530 1727096011.18125: stdout chunk (state=3): >>> # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 7530 1727096011.18142: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 7530 1727096011.18172: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 7530 1727096011.18197: stdout chunk (state=3): >>> # destroy selinux._selinux<<< 7530 1727096011.18204: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128<<< 7530 1727096011.18335: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7530 1727096011.18510: stdout chunk (state=3): >>># destroy sys.monitoring <<< 7530 1727096011.18531: stdout chunk (state=3): >>># destroy _socket<<< 7530 1727096011.18537: stdout chunk (state=3): >>> <<< 7530 1727096011.18574: stdout chunk (state=3): >>># destroy _collections <<< 7530 1727096011.18623: stdout chunk (state=3): >>># destroy platform<<< 7530 1727096011.18634: stdout chunk (state=3): >>> <<< 7530 1727096011.18652: stdout chunk (state=3): >>># destroy _uuid <<< 7530 1727096011.18672: stdout chunk (state=3): >>># destroy stat<<< 7530 1727096011.18676: stdout chunk (state=3): >>> # destroy genericpath <<< 7530 1727096011.18691: stdout chunk (state=3): >>># destroy re._parser<<< 7530 1727096011.18701: stdout chunk (state=3): >>> <<< 7530 1727096011.18747: stdout chunk (state=3): 
>>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib<<< 7530 1727096011.18763: stdout chunk (state=3): >>> <<< 7530 1727096011.18771: stdout chunk (state=3): >>># destroy copyreg <<< 7530 1727096011.18816: stdout chunk (state=3): >>># destroy contextlib # destroy _typing<<< 7530 1727096011.18824: stdout chunk (state=3): >>> <<< 7530 1727096011.18849: stdout chunk (state=3): >>># destroy _tokenize <<< 7530 1727096011.18859: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse<<< 7530 1727096011.19042: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 7530 1727096011.19076: stdout chunk (state=3): >>># destroy codecs<<< 7530 1727096011.19093: stdout chunk (state=3): >>> <<< 7530 1727096011.19098: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8<<< 7530 1727096011.19120: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig<<< 7530 1727096011.19135: stdout chunk (state=3): >>> <<< 7530 1727096011.19154: stdout chunk (state=3): >>># destroy encodings.cp437<<< 7530 1727096011.19160: stdout chunk (state=3): >>> # destroy encodings.idna<<< 7530 1727096011.19180: stdout chunk (state=3): >>> <<< 7530 1727096011.19191: stdout chunk (state=3): >>># destroy _codecs # destroy io<<< 7530 1727096011.19218: stdout chunk (state=3): >>> # destroy traceback<<< 7530 1727096011.19232: stdout chunk (state=3): >>> # destroy warnings<<< 7530 1727096011.19246: stdout chunk (state=3): >>> <<< 7530 1727096011.19252: stdout chunk (state=3): >>># destroy weakref <<< 7530 
1727096011.19276: stdout chunk (state=3): >>># destroy collections<<< 7530 1727096011.19280: stdout chunk (state=3): >>> # destroy threading # destroy atexit<<< 7530 1727096011.19297: stdout chunk (state=3): >>> # destroy _warnings # destroy math<<< 7530 1727096011.19312: stdout chunk (state=3): >>> # destroy _bisect # destroy time<<< 7530 1727096011.19354: stdout chunk (state=3): >>> # destroy _random<<< 7530 1727096011.19365: stdout chunk (state=3): >>> <<< 7530 1727096011.19372: stdout chunk (state=3): >>># destroy _weakref<<< 7530 1727096011.19407: stdout chunk (state=3): >>> # destroy _hashlib <<< 7530 1727096011.19438: stdout chunk (state=3): >>># destroy _operator<<< 7530 1727096011.19454: stdout chunk (state=3): >>> # destroy _sre # destroy _string<<< 7530 1727096011.19485: stdout chunk (state=3): >>> # destroy re # destroy itertools <<< 7530 1727096011.19523: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 7530 1727096011.19545: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 7530 1727096011.19738: stdout chunk (state=3): >>> <<< 7530 1727096011.20132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096011.20172: stderr chunk (state=3): >>><<< 7530 1727096011.20176: stdout chunk (state=3): >>><<< 7530 1727096011.20282: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e5104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e4dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e305130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e305fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e343ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e343f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e37b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e37bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e35bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3592b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e341070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e39b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e39a3f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e35a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e398bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3402f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3d0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3d0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e33ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3e8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3e9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3eacc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3eb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3ea210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e3ebd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3eb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e0dfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e1087a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e108500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e1087d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e109100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e109af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1089b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e0dddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e10af00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e109c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e3d2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e133230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1575f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1b8380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1baae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1b84a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e179370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0dfc5430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e1563f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e10be00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7eff0e156750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_taov23ay/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7eff0e02f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e00e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e00d1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e02d010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e05ea20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05e7b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05e0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05e540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e02fb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e05f7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0e05fa10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0e05ff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d931d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d933920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7eff0d9342f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d935490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d937f80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d93c2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d936240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93fef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93e9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93e720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d93ec90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d936750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d983f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d984230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d985cd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d985a90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d988260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d986390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d98b9b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d9883b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98c7a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98c9b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98ccb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d984350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d818380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d819430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d98eb10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d98fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d98e750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d81d760> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81e450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d819580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81e600> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81f890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d82a270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d825ac0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d902ab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d9fe780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d82a2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d81f110> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8ba4b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d528290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d5285f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8a4230> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8bafc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8b8bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8b87d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d52b560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d52ae40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d52afc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d52a2a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d52b680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d576180> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d5741a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d8b8860> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d577fb0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d576db0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d5b63c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d988200> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d5c9f70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d5a6f60> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7eff0d3c70e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d3c4830> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7eff0d3c51f0> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": 
"5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "53", "second": "31", "epoch": "1727096011", "epoch_int": "1727096011", "date": "2024-09-23", "time": "08:53:31", "iso8601_micro": "2024-09-23T12:53:31.139423Z", "iso8601": "2024-09-23T12:53:31Z", "iso8601_basic": "20240923T085331139423", "iso8601_basic_short": "20240923T085331", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # 
destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] 
removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy 
swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks
7530 1727096011.21106: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
7530 1727096011.21110: _low_level_execute_command(): starting
7530 1727096011.21125: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096010.5732653-7589-112236536380905/ > /dev/null 2>&1 && sleep 0'
7530 1727096011.21128: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7530 1727096011.21131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096011.21134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096011.21136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7530 1727096011.21138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<<
7530 1727096011.21141: stderr chunk (state=3): >>>debug2: match not found <<<
7530 1727096011.21143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096011.21145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7530 1727096011.21147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<<
7530 1727096011.21149: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7530 1727096011.21156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096011.21173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<<
7530 1727096011.21176: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096011.21226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096011.21229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096011.21249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096011.21302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096011.23939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096011.23963: stderr chunk (state=3): >>><<<
7530 1727096011.23966: stdout chunk (state=3): >>><<<
7530 1727096011.23986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096011.23992: handler run complete
7530 1727096011.24027: variable 'ansible_facts' from source: unknown
7530 1727096011.24066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096011.24140: variable 'ansible_facts' from source: unknown
7530 1727096011.24172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096011.24206: attempt loop complete, returning result
7530 1727096011.24209: _execute() done
7530 1727096011.24211: dumping result to json
7530 1727096011.24221: done dumping result, returning
7530 1727096011.24228: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-086b-f4f0-000000000166]
7530 1727096011.24232: sending task result for task 0afff68d-5257-086b-f4f0-000000000166
7530 1727096011.24366: done sending task result for task 0afff68d-5257-086b-f4f0-000000000166
7530 1727096011.24371: WORKER PROCESS EXITING
ok: [managed_node3]
7530 1727096011.24469: no more pending results, returning what we have
7530 1727096011.24472: results queue empty
7530 1727096011.24473: checking for any_errors_fatal
7530 1727096011.24474: done checking for any_errors_fatal
7530 1727096011.24475: checking for max_fail_percentage
7530 1727096011.24476: done checking for max_fail_percentage
7530 1727096011.24477: checking to see if all hosts have failed and the running result is not ok
7530 1727096011.24478: done checking to see if all hosts have failed
7530 1727096011.24478: getting the remaining hosts for this loop
7530 1727096011.24480: done getting the remaining hosts for this loop
7530 1727096011.24483: getting the next task for host managed_node3
7530 1727096011.24490: done getting next task for host managed_node3
7530 1727096011.24493: ^ task is: TASK: Check if system is ostree
7530 1727096011.24495: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096011.24498: getting variables
7530 1727096011.24500: in VariableManager get_vars()
7530 1727096011.24526: Calling all_inventory to load vars for managed_node3
7530 1727096011.24528: Calling groups_inventory to load vars for managed_node3
7530 1727096011.24531: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096011.24540: Calling all_plugins_play to load vars for managed_node3
7530 1727096011.24542: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096011.24545: Calling groups_plugins_play to load vars for managed_node3
7530 1727096011.24689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096011.24802: done with get_vars()
7530 1727096011.24809: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Monday 23 September 2024 08:53:31 -0400 (0:00:00.760) 0:00:02.037 ******
7530 1727096011.24879: entering _queue_task() for managed_node3/stat
7530 1727096011.25082: worker is 1 (out of 1 available)
7530 1727096011.25095: exiting _queue_task() for managed_node3/stat
7530 1727096011.25106: done queuing things up, now waiting for results queue to drain
7530 1727096011.25108: waiting for pending results...
7530 1727096011.25252: running TaskExecutor() for managed_node3/TASK: Check if system is ostree
7530 1727096011.25320: in run() - task 0afff68d-5257-086b-f4f0-000000000168
7530 1727096011.25334: variable 'ansible_search_path' from source: unknown
7530 1727096011.25338: variable 'ansible_search_path' from source: unknown
7530 1727096011.25364: calling self._execute()
7530 1727096011.25422: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096011.25429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096011.25441: variable 'omit' from source: magic vars
7530 1727096011.25780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096011.25963: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096011.26000: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096011.26026: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096011.26051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096011.26136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096011.26153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096011.26172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096011.26190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096011.26287: Evaluated conditional (not __network_is_ostree is defined): True
7530 1727096011.26291: variable 'omit' from source: magic vars
7530 1727096011.26323: variable 'omit' from source: magic vars
7530 1727096011.26349: variable 'omit' from source: magic vars
7530 1727096011.26371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096011.26391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096011.26406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096011.26423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096011.26429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096011.26453: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096011.26457: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096011.26459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096011.26526: Set connection var ansible_pipelining to False
7530 1727096011.26529: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096011.26538: Set connection var ansible_timeout to 10
7530 1727096011.26544: Set connection var ansible_shell_executable to /bin/sh
7530 1727096011.26548: Set connection var ansible_shell_type to sh
7530 1727096011.26550: Set connection var ansible_connection to ssh
7530 1727096011.26571: variable 'ansible_shell_executable' from source: unknown
7530 1727096011.26574: variable 'ansible_connection' from source: unknown
7530 1727096011.26577: variable 'ansible_module_compression' from source: unknown
7530 1727096011.26579: variable 'ansible_shell_type' from source: unknown
7530 1727096011.26581: variable 'ansible_shell_executable' from source: unknown
7530 1727096011.26584: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096011.26586: variable 'ansible_pipelining' from source: unknown
7530 1727096011.26590: variable 'ansible_timeout' from source: unknown
7530 1727096011.26595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096011.26700: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
7530 1727096011.26708: variable 'omit' from source: magic vars
7530 1727096011.26713: starting attempt loop
7530 1727096011.26716: running the handler
7530 1727096011.26728: _low_level_execute_command(): starting
7530 1727096011.26735: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096011.27243: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096011.27248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096011.27252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096011.27308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096011.27312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096011.27320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096011.27363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096011.29788: stdout chunk (state=3): >>>/root <<<
7530 1727096011.29922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096011.29954: stderr chunk (state=3): >>><<<
7530 1727096011.29957: stdout chunk (state=3): >>><<<
7530 1727096011.29982: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096011.29996: _low_level_execute_command(): starting
7530 1727096011.30002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301 `" && echo ansible-tmp-1727096011.2998288-7628-25049707243301="` echo /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301 `" ) && sleep 0'
7530 1727096011.30442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096011.30446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7530 1727096011.30477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<<
7530 1727096011.30480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096011.30483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096011.30485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096011.30544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096011.30547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096011.30550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096011.30599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096011.33396: stdout chunk (state=3): >>>ansible-tmp-1727096011.2998288-7628-25049707243301=/root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301 <<<
7530 1727096011.33563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096011.33594: stderr chunk (state=3): >>><<<
7530 1727096011.33597: stdout chunk (state=3): >>><<<
7530 1727096011.33612: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096011.2998288-7628-25049707243301=/root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from
master 0 7530 1727096011.33663: variable 'ansible_module_compression' from source: unknown 7530 1727096011.33709: ANSIBALLZ: Using lock for stat 7530 1727096011.33712: ANSIBALLZ: Acquiring lock 7530 1727096011.33714: ANSIBALLZ: Lock acquired: 139837168145696 7530 1727096011.33716: ANSIBALLZ: Creating module 7530 1727096011.45875: ANSIBALLZ: Writing module into payload 7530 1727096011.45928: ANSIBALLZ: Writing module 7530 1727096011.45955: ANSIBALLZ: Renaming module 7530 1727096011.45966: ANSIBALLZ: Done creating module 7530 1727096011.45990: variable 'ansible_facts' from source: unknown 7530 1727096011.46073: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py 7530 1727096011.46400: Sending initial data 7530 1727096011.46404: Sent initial data (150 bytes) 7530 1727096011.46947: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096011.46951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096011.46954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096011.47022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 7530 1727096011.47036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096011.47086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096011.48786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096011.48818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096011.48847: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyv0kt1hz /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py <<< 7530 1727096011.48878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py" <<< 7530 1727096011.48915: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyv0kt1hz" to remote "/root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py" <<< 7530 1727096011.49628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096011.49672: stderr chunk (state=3): >>><<< 7530 1727096011.49681: stdout chunk (state=3): >>><<< 7530 1727096011.49729: done transferring module to remote 7530 1727096011.49755: _low_level_execute_command(): starting 7530 1727096011.49764: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/ /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py && sleep 0' 7530 1727096011.50503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096011.50553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096011.50584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096011.50608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096011.50687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096011.52606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096011.52638: stderr chunk (state=3): >>><<< 7530 1727096011.52641: stdout chunk (state=3): >>><<< 7530 1727096011.52741: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096011.52745: _low_level_execute_command(): starting 7530 1727096011.52747: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/AnsiballZ_stat.py && sleep 0' 7530 1727096011.53361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096011.53381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096011.53397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096011.53430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096011.53448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096011.53535: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096011.53574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096011.53600: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 7530 1727096011.53681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096011.55966: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 7530 1727096011.56007: stdout chunk (state=3): >>>import _imp # builtin <<< 7530 1727096011.56034: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 7530 1727096011.56083: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7530 1727096011.56116: stdout chunk (state=3): >>>import 'posix' # <<< 7530 1727096011.56169: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 7530 1727096011.56188: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 7530 1727096011.56236: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 7530 1727096011.56260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096011.56288: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 7530 1727096011.56307: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 7530 1727096011.56351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 7530 1727096011.56358: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f98bb00> <<< 7530 1727096011.56417: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9bea50> <<< 7530 1727096011.56443: stdout chunk (state=3): >>>import '_signal' # <<< 7530 1727096011.56446: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 7530 1727096011.56462: stdout chunk (state=3): >>>import 'io' # <<< 7530 1727096011.56499: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 7530 1727096011.56584: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7530 1727096011.56620: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7530 1727096011.56852: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9cdfa0> <<< 7530 1727096011.56892: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7530 1727096011.57268: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7530 1727096011.57294: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096011.57364: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 7530 1727096011.57392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7530 1727096011.57415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 7530 1727096011.57436: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7abe90> <<< 7530 1727096011.57452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 7530 1727096011.57501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 7530 1727096011.57526: stdout chunk (state=3): >>>import '_operator' # <<< 7530 1727096011.57531: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7abf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 7530 1727096011.57582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 7530 1727096011.57648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096011.57666: stdout chunk (state=3): >>>import 'itertools' # <<< 7530 1727096011.57722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7e3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 7530 1727096011.57734: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7e3f20> <<< 7530 1727096011.57813: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7c3b60> <<< 7530 1727096011.57829: stdout chunk (state=3): >>>import '_functools' # <<< 7530 1727096011.57858: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7c1280> <<< 7530 1727096011.57994: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7a9040> <<< 7530 1727096011.58023: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7530 1727096011.58049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 7530 1727096011.58075: stdout chunk (state=3): >>>import '_sre' # <<< 7530 1727096011.58119: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object 
from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7530 1727096011.58143: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 7530 1727096011.58150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7530 1727096011.58200: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f803800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f802420> <<< 7530 1727096011.58233: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 7530 1727096011.58258: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7c2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f800b60> <<< 7530 1727096011.58300: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 7530 1727096011.58311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f838830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7a82c0> <<< 7530 1727096011.58347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 7530 1727096011.58409: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' 
executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f838ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f838b90> <<< 7530 1727096011.58458: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f838f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7a6de0> <<< 7530 1727096011.58509: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7530 1727096011.58546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 7530 1727096011.58559: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f8395b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f8392b0> <<< 7530 1727096011.58607: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 7530 1727096011.58622: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f83a4b0> <<< 7530 1727096011.58647: stdout chunk (state=3): >>>import 
'importlib.util' # import 'runpy' # <<< 7530 1727096011.58719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7530 1727096011.58736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 7530 1727096011.58832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 7530 1727096011.58850: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f850680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f851d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 7530 1727096011.58853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 7530 1727096011.58909: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f852c30> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f853290> <<< 7530 1727096011.58952: stdout chunk (state=3): >>>import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f852180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 7530 1727096011.59050: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f853d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f853440> <<< 7530 1727096011.59066: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f83a450> <<< 7530 1727096011.59095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 7530 1727096011.59128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 7530 1727096011.59152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 7530 1727096011.59316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5cfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f8770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5f84d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f86e0> <<< 7530 1727096011.59321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 7530 1727096011.59413: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.59599: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.59604: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f90a0> <<< 7530 1727096011.59843: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f9a90> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5f8950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5cddf0> 
<<< 7530 1727096011.59857: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7530 1727096011.59872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7530 1727096011.59909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 7530 1727096011.59919: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5fae70> <<< 7530 1727096011.59948: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5f9bb0> <<< 7530 1727096011.59963: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f83aba0> <<< 7530 1727096011.59988: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7530 1727096011.60178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096011.60278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f61f1d0> <<< 7530 1727096011.60371: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7530 1727096011.60446: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f647560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7530 1727096011.60454: stdout chunk (state=3): >>>import 'ntpath' # <<< 7530 1727096011.60479: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f6a82c0> <<< 7530 1727096011.60521: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 7530 1727096011.60530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 7530 1727096011.60560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7530 1727096011.60627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7530 1727096011.60705: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f6aaa20> <<< 7530 1727096011.60813: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f6a83e0> <<< 7530 1727096011.60827: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f66d2e0> <<< 7530 1727096011.60864: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef253d0> <<< 7530 1727096011.60906: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f646360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5fbda0> <<< 7530 1727096011.61014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7530 1727096011.61037: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2d2f646960> <<< 7530 1727096011.61214: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_1mxr8usw/ansible_stat_payload.zip' <<< 7530 1727096011.61226: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.61350: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.61381: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 7530 1727096011.61398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7530 1727096011.61762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef7b0e0> import '_typing' # <<< 7530 
1727096011.61874: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef59fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef59160> <<< 7530 1727096011.61887: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.61926: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 7530 1727096011.61950: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.61978: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 7530 1727096011.62044: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.64471: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.65799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef78fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 7530 1727096011.65815: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2efa2b40> <<< 7530 1727096011.65855: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa28d0> <<< 7530 1727096011.65909: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa21e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7530 1727096011.65987: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa2c30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef7bd70> import 'atexit' # <<< 7530 1727096011.66071: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2efa3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2efa3a40> <<< 7530 1727096011.66098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 7530 1727096011.66173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 7530 1727096011.66242: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa3f50> import 'pwd' # <<< 7530 1727096011.66308: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7530 1727096011.66353: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee0dc40> <<< 7530 1727096011.66406: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.66438: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee0f860> <<< 7530 1727096011.66456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7530 1727096011.66495: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee10260> <<< 7530 1727096011.66527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7530 1727096011.66580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7530 1727096011.66596: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee11130> <<< 7530 1727096011.66674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7530 1727096011.66700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object 
from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7530 1727096011.66780: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee13e30> <<< 7530 1727096011.66825: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f7a6ed0> <<< 7530 1727096011.66861: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee12120> <<< 7530 1727096011.66933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 7530 1727096011.66957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 7530 1727096011.67007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7530 1727096011.67065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 7530 1727096011.67086: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1bec0> <<< 7530 1727096011.67162: stdout chunk (state=3): >>>import '_tokenize' # <<< 7530 1727096011.67201: stdout chunk (state=3): 
>>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1a6f0> <<< 7530 1727096011.67237: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7530 1727096011.67398: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1ac60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee12630> <<< 7530 1727096011.67432: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee63f50> <<< 7530 1727096011.67560: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee63f80> <<< 7530 1727096011.67605: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7530 1727096011.67632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7530 1727096011.67690: 
stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee65b20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee658e0> <<< 7530 1727096011.67778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7530 1727096011.67945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7530 1727096011.68027: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096011.68030: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee680e0><<< 7530 1727096011.68049: stdout chunk (state=3): >>> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee66210><<< 7530 1727096011.68086: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 7530 1727096011.68099: stdout chunk (state=3): >>> <<< 7530 1727096011.68163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096011.68215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 7530 1727096011.68303: stdout chunk (state=3): >>>import '_string' # import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee6b830> <<< 7530 1727096011.68526: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee68200> <<< 7530 1727096011.68621: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.68654: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6c650> <<< 7530 1727096011.68799: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6c680> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.68803: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.68832: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6cc50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee641a0> <<< 7530 1727096011.68863: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 
7530 1727096011.68905: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 7530 1727096011.68949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 7530 1727096011.69153: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2eef8380> <<< 7530 1727096011.69300: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.69325: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2eef9970> <<< 7530 1727096011.69354: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee6eb10> <<< 7530 1727096011.69396: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 7530 1727096011.69433: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee6e750> <<< 7530 1727096011.69478: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.69510: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 
1727096011.69531: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 7530 1727096011.69566: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.69691: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.69838: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.69884: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 7530 1727096011.69913: stdout chunk (state=3): >>> <<< 7530 1727096011.69931: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.69957: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 7530 1727096011.70137: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.70261: stdout chunk (state=3): >>> <<< 7530 1727096011.70337: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.71278: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.72249: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 7530 1727096011.72266: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 7530 1727096011.72287: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 7530 1727096011.72295: stdout chunk (state=3): >>> <<< 7530 1727096011.72302: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 7530 1727096011.72342: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 7530 1727096011.72383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 7530 1727096011.72391: stdout chunk (state=3): >>> <<< 7530 1727096011.72466: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2eefdaf0><<< 7530 1727096011.72540: stdout chunk (state=3): >>> <<< 7530 1727096011.72622: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 7530 1727096011.72661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 7530 1727096011.72692: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eefe750> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eef9b80> <<< 7530 1727096011.72759: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7530 1727096011.72815: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7530 1727096011.72820: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.72842: stdout chunk (state=3): >>> import 'ansible.module_utils._text' # <<< 7530 1727096011.73145: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.73173: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.73382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 7530 1727096011.73404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 7530 1727096011.73424: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eefe450> <<< 7530 1727096011.73489: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.74270: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 7530 1727096011.74921: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.75037: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.75160: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 7530 1727096011.75190: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7530 1727096011.75239: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.75308: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 7530 1727096011.75323: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.75549: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 7530 1727096011.75600: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7530 1727096011.75605: stdout chunk (state=3): >>> <<< 7530 1727096011.75652: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 7530 1727096011.75655: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.75670: stdout chunk (state=3): >>> <<< 7530 1727096011.75717: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.75929: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 7530 1727096011.76219: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.76597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 7530 1727096011.76704: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7530 1727096011.76742: stdout chunk (state=3): >>>import '_ast' # <<< 7530 1727096011.77012: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eeff980> # zipimport: zlib available <<< 7530 1727096011.77016: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 7530 1727096011.77126: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 7530 1727096011.77152: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 7530 1727096011.77186: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 7530 1727096011.77278: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.77341: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 7530 1727096011.77486: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.77537: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.77583: stdout chunk (state=3): >>> # zipimport: zlib available <<< 7530 1727096011.77703: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7530 1727096011.77761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 7530 1727096011.77766: stdout chunk (state=3): >>> <<< 7530 1727096011.77901: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096011.77905: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 7530 1727096011.77931: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ed0a450> <<< 7530 1727096011.77987: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ed05100><<< 7530 
1727096011.78040: stdout chunk (state=3): >>> import 'ansible.module_utils.common.file' # <<< 7530 1727096011.78055: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 7530 1727096011.78076: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.78177: stdout chunk (state=3): >>> # zipimport: zlib available<<< 7530 1727096011.78184: stdout chunk (state=3): >>> <<< 7530 1727096011.78275: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.78320: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.78323: stdout chunk (state=3): >>> <<< 7530 1727096011.78390: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 7530 1727096011.78402: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 7530 1727096011.78439: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 7530 1727096011.78444: stdout chunk (state=3): >>> <<< 7530 1727096011.78484: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 7530 1727096011.78604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 7530 1727096011.78639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7530 1727096011.78677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7530 1727096011.78775: stdout chunk (state=3): >>>import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d2effec90> <<< 7530 1727096011.78849: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efee960> <<< 7530 1727096011.78971: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ed0a4e0> <<< 7530 1727096011.78991: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ed007d0> <<< 7530 1727096011.79001: stdout chunk (state=3): >>># destroy ansible.module_utils.distro<<< 7530 1727096011.79029: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 7530 1727096011.79035: stdout chunk (state=3): >>> <<< 7530 1727096011.79073: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.79135: stdout chunk (state=3): >>> import 'ansible.module_utils.common._utils' # <<< 7530 1727096011.79137: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 7530 1727096011.79152: stdout chunk (state=3): >>> <<< 7530 1727096011.79224: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 7530 1727096011.79276: stdout chunk (state=3): >>> <<< 7530 1727096011.79284: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7530 1727096011.79303: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 7530 1727096011.79326: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.79339: stdout chunk (state=3): >>> <<< 7530 1727096011.79593: stdout chunk (state=3): >>># zipimport: zlib available <<< 7530 1727096011.79842: stdout chunk (state=3): >>># zipimport: zlib available<<< 7530 1727096011.79995: stdout chunk (state=3): >>> <<< 7530 1727096011.80010: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, 
"get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 7530 1727096011.80035: stdout chunk (state=3): >>># destroy __main__<<< 7530 1727096011.80041: stdout chunk (state=3): >>> <<< 7530 1727096011.80621: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._<<< 7530 1727096011.80703: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 <<< 7530 1727096011.80707: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc <<< 7530 1727096011.80747: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr<<< 7530 1727096011.80751: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 7530 1727096011.80858: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # 
destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix<<< 7530 1727096011.80886: stdout chunk (state=3): >>> # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random<<< 7530 1727096011.80917: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib<<< 7530 1727096011.80950: stdout chunk (state=3): >>> # cleanup[2] removing ntpath # cleanup[2] removing urllib # 
destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible<<< 7530 1727096011.80986: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit<<< 7530 1727096011.81007: stdout chunk (state=3): >>> # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback<<< 7530 1727096011.81024: stdout chunk (state=3): >>> # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string<<< 7530 1727096011.81028: stdout chunk (state=3): >>> # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 
# cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array<<< 7530 1727096011.81044: stdout chunk (state=3): >>> # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text<<< 7530 1727096011.81064: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian<<< 7530 1727096011.81093: stdout chunk (state=3): >>> # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections<<< 7530 1727096011.81101: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast<<< 7530 1727096011.81112: stdout chunk (state=3): >>> # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation<<< 7530 1727096011.81129: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4<<< 7530 1727096011.81142: stdout chunk (state=3): >>> # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process<<< 7530 1727096011.81162: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils<<< 7530 1727096011.81340: stdout chunk (state=3): >>> # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 7530 1727096011.81649: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7530 1727096011.81662: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 7530 1727096011.81690: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 7530 1727096011.81711: stdout chunk (state=3): >>># 
destroy _lzma <<< 7530 1727096011.81714: stdout chunk (state=3): >>># destroy _blake2 <<< 7530 1727096011.81754: stdout chunk (state=3): >>># destroy binascii # destroy struct <<< 7530 1727096011.81757: stdout chunk (state=3): >>># destroy zlib # destroy bz2 # destroy lzma <<< 7530 1727096011.81760: stdout chunk (state=3): >>># destroy zipfile._path<<< 7530 1727096011.81795: stdout chunk (state=3): >>> <<< 7530 1727096011.81802: stdout chunk (state=3): >>># destroy zipfile<<< 7530 1727096011.81806: stdout chunk (state=3): >>> # destroy pathlib <<< 7530 1727096011.81833: stdout chunk (state=3): >>># destroy zipfile._path.glob # destroy fnmatch<<< 7530 1727096011.81839: stdout chunk (state=3): >>> <<< 7530 1727096011.81857: stdout chunk (state=3): >>># destroy ipaddress <<< 7530 1727096011.81886: stdout chunk (state=3): >>># destroy ntpath<<< 7530 1727096011.81896: stdout chunk (state=3): >>> <<< 7530 1727096011.81919: stdout chunk (state=3): >>># destroy importlib<<< 7530 1727096011.81934: stdout chunk (state=3): >>> # destroy zipimport <<< 7530 1727096011.81948: stdout chunk (state=3): >>># destroy __main__ <<< 7530 1727096011.81958: stdout chunk (state=3): >>># destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux <<< 7530 1727096011.81979: stdout chunk (state=3): >>># destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp<<< 7530 1727096011.82004: stdout chunk (state=3): >>> # destroy encodings # destroy _locale<<< 7530 1727096011.82023: stdout chunk (state=3): >>> # destroy pwd # destroy locale<<< 7530 1727096011.82032: stdout chunk (state=3): >>> <<< 7530 1727096011.82047: stdout chunk (state=3): >>># destroy signal # destroy fcntl # destroy select<<< 7530 1727096011.82057: stdout chunk (state=3): >>> # destroy _signal # destroy _posixsubprocess<<< 7530 1727096011.82065: stdout chunk (state=3): >>> # destroy syslog<<< 7530 1727096011.82095: 
stdout chunk (state=3): >>> # destroy uuid # destroy selectors<<< 7530 1727096011.82104: stdout chunk (state=3): >>> # destroy errno<<< 7530 1727096011.82124: stdout chunk (state=3): >>> # destroy array<<< 7530 1727096011.82129: stdout chunk (state=3): >>> <<< 7530 1727096011.82169: stdout chunk (state=3): >>># destroy datetime # destroy selinux<<< 7530 1727096011.82176: stdout chunk (state=3): >>> # destroy shutil<<< 7530 1727096011.82206: stdout chunk (state=3): >>> # destroy distro <<< 7530 1727096011.82212: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse<<< 7530 1727096011.82289: stdout chunk (state=3): >>> # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux <<< 7530 1727096011.82307: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian<<< 7530 1727096011.82313: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes<<< 7530 1727096011.82338: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 7530 1727096011.82349: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves<<< 7530 1727096011.82357: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket<<< 7530 1727096011.82378: stdout chunk (state=3): >>> # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 7530 1727096011.82390: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 7530 1727096011.82406: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 7530 1727096011.82416: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 7530 1727096011.82436: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 7530 1727096011.82439: stdout chunk (state=3): >>> <<< 7530 1727096011.82459: stdout 
chunk (state=3): >>># cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 7530 1727096011.82467: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 7530 1727096011.82482: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random<<< 7530 1727096011.82508: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 7530 1727096011.82512: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap<<< 7530 1727096011.82514: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants<<< 7530 1727096011.82534: stdout chunk (state=3): >>> # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 7530 1727096011.82539: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 7530 1727096011.82560: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 7530 1727096011.82565: stdout chunk (state=3): >>> # cleanup[3] wiping collections<<< 7530 1727096011.82589: stdout chunk (state=3): >>> # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools<<< 7530 1727096011.82593: stdout chunk (state=3): >>> # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 7530 1727096011.82611: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath<<< 7530 1727096011.82626: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 7530 1727096011.82629: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8<<< 7530 
1727096011.82651: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external<<< 7530 1727096011.82659: stdout chunk (state=3): >>> # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 7530 1727096011.82742: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7530 1727096011.83006: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 7530 1727096011.83032: stdout chunk (state=3): >>># destroy _collections <<< 7530 1727096011.83072: stdout chunk (state=3): >>># destroy platform <<< 7530 1727096011.83105: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser<<< 7530 1727096011.83140: stdout chunk (state=3): >>> # destroy tokenize<<< 7530 1727096011.83175: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib <<< 7530 1727096011.83191: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib<<< 7530 1727096011.83236: stdout chunk (state=3): >>> # destroy _typing <<< 7530 1727096011.83241: stdout chunk (state=3): >>># destroy _tokenize <<< 7530 1727096011.83282: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 7530 1727096011.83285: stdout chunk (state=3): >>> # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 7530 1727096011.83340: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules<<< 7530 1727096011.83343: stdout chunk (state=3): >>> # destroy _frozen_importlib<<< 7530 1727096011.83443: stdout chunk (state=3): >>> <<< 7530 1727096011.83512: stdout chunk (state=3): >>># destroy codecs <<< 7530 1727096011.83551: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 7530 1727096011.83619: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref <<< 7530 1727096011.83622: stdout chunk (state=3): >>># destroy collections # destroy threading <<< 7530 1727096011.83651: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 7530 1727096011.83738: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 7530 1727096011.83745: stdout chunk (state=3): >>># destroy _operator<<< 7530 1727096011.83770: stdout chunk (state=3): >>> # destroy _string<<< 7530 1727096011.83797: stdout chunk (state=3): >>> # destroy re # destroy itertools # destroy _abc # destroy _sre<<< 7530 1727096011.83811: stdout chunk (state=3): >>> # destroy posix # destroy _functools<<< 7530 1727096011.83837: stdout chunk (state=3): >>> # destroy builtins # destroy _thread # clear sys.audit hooks <<< 7530 1727096011.84340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096011.84372: stderr chunk (state=3): >>><<< 7530 1727096011.84375: stdout chunk (state=3): >>><<< 7530 1727096011.84444: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f98bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f9cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7abe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7abf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7e3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7e3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7c3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7c1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7a9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f803800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f802420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7c2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f800b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f838830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7a82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f838ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f838b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f838f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f7a6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f8395b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f8392b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f83a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f850680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f851d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f852c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f853290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f852180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f853d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f853440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f83a450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5cfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f8770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5f84d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f90a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f5f9a90> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5f8950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5cddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5fae70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5f9bb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f83aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f61f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f647560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f6a82c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f6aaa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f6a83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f66d2e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f646360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2f5fbda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2d2f646960> # zipimport: found 30 names in '/tmp/ansible_stat_payload_1mxr8usw/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2d2ef7b0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef59fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef59160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef78fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2efa2b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa28d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa21e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa2c30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ef7bd70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2efa3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2efa3a40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efa3f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee0dc40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee0f860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee10260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee11130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee13e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2f7a6ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee12120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1bec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1a6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee1ac60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee12630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee63f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee63f80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee65b20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee658e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee680e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee66210> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee6b830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee68200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6c650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6c680> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6cc50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2eef8380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2eef9970> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee6eb10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ee6fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ee6e750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2eefdaf0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eefe750> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eef9b80> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eefe450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2eeff980> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2d2ed0a450> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ed05100> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2effec90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2efee960> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ed0a4e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2d2ed007d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. [WARNING]: Module invocation had junk after the JSON data: 7530 1727096011.84978: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096011.84982: _low_level_execute_command(): starting 7530 1727096011.84984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096011.2998288-7628-25049707243301/ > /dev/null 2>&1 && sleep 0' 7530 1727096011.85120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096011.85124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096011.85135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096011.85174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096011.85191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096011.85244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096011.87943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096011.87969: stderr chunk (state=3): >>><<< 7530 1727096011.87973: stdout chunk (state=3): >>><<< 7530 1727096011.87993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096011.87999: handler run complete 7530 1727096011.88019: attempt loop complete, returning result 7530 1727096011.88024: _execute() done 7530 1727096011.88027: dumping result to json 7530 1727096011.88031: done dumping result, returning 7530 1727096011.88038: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0afff68d-5257-086b-f4f0-000000000168] 7530 1727096011.88042: sending task result for task 0afff68d-5257-086b-f4f0-000000000168 7530 1727096011.88129: done sending task result for task 0afff68d-5257-086b-f4f0-000000000168 7530 1727096011.88132: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7530 1727096011.88195: no more pending results, returning what we have 7530 1727096011.88198: results queue empty 7530 1727096011.88199: checking for any_errors_fatal 7530 1727096011.88205: done checking for any_errors_fatal 7530 1727096011.88206: checking for max_fail_percentage 7530 1727096011.88208: done checking for max_fail_percentage 7530 1727096011.88208: checking to see if all hosts have failed and the running result is not ok 7530 1727096011.88209: done checking to see if all hosts have failed 7530 1727096011.88210: getting the remaining hosts for this loop 7530 1727096011.88211: done getting the remaining hosts for this loop 7530 1727096011.88215: getting the next task for host managed_node3 7530 1727096011.88220: done getting next task for host managed_node3 7530 1727096011.88223: ^ task is: TASK: Set flag to indicate system is ostree 7530 1727096011.88226: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096011.88229: getting variables 7530 1727096011.88230: in VariableManager get_vars() 7530 1727096011.88258: Calling all_inventory to load vars for managed_node3 7530 1727096011.88261: Calling groups_inventory to load vars for managed_node3 7530 1727096011.88264: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096011.88277: Calling all_plugins_play to load vars for managed_node3 7530 1727096011.88279: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096011.88282: Calling groups_plugins_play to load vars for managed_node3 7530 1727096011.88447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096011.88572: done with get_vars() 7530 1727096011.88581: done getting variables 7530 1727096011.88656: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:53:31 -0400 (0:00:00.637) 0:00:02.675 ****** 7530 1727096011.88679: entering _queue_task() for managed_node3/set_fact 7530 1727096011.88680: Creating lock for set_fact 7530 1727096011.88910: worker is 1 (out of 1 available) 7530 1727096011.88925: 
exiting _queue_task() for managed_node3/set_fact 7530 1727096011.88935: done queuing things up, now waiting for results queue to drain 7530 1727096011.88936: waiting for pending results... 7530 1727096011.89086: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 7530 1727096011.89150: in run() - task 0afff68d-5257-086b-f4f0-000000000169 7530 1727096011.89164: variable 'ansible_search_path' from source: unknown 7530 1727096011.89168: variable 'ansible_search_path' from source: unknown 7530 1727096011.89198: calling self._execute() 7530 1727096011.89255: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096011.89259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096011.89272: variable 'omit' from source: magic vars 7530 1727096011.89675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096011.89854: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096011.89888: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096011.89913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096011.89941: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096011.90006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096011.90024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096011.90046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096011.90064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096011.90159: Evaluated conditional (not __network_is_ostree is defined): True 7530 1727096011.90163: variable 'omit' from source: magic vars 7530 1727096011.90194: variable 'omit' from source: magic vars 7530 1727096011.90286: variable '__ostree_booted_stat' from source: set_fact 7530 1727096011.90325: variable 'omit' from source: magic vars 7530 1727096011.90346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096011.90372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096011.90387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096011.90400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096011.90409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096011.90432: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096011.90435: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096011.90438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096011.90509: Set connection var ansible_pipelining to False 7530 1727096011.90515: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096011.90522: Set connection var ansible_timeout to 10 7530 1727096011.90528: Set connection var ansible_shell_executable to /bin/sh 7530 1727096011.90531: Set connection var ansible_shell_type 
to sh 7530 1727096011.90533: Set connection var ansible_connection to ssh 7530 1727096011.90552: variable 'ansible_shell_executable' from source: unknown 7530 1727096011.90555: variable 'ansible_connection' from source: unknown 7530 1727096011.90558: variable 'ansible_module_compression' from source: unknown 7530 1727096011.90560: variable 'ansible_shell_type' from source: unknown 7530 1727096011.90562: variable 'ansible_shell_executable' from source: unknown 7530 1727096011.90564: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096011.90569: variable 'ansible_pipelining' from source: unknown 7530 1727096011.90572: variable 'ansible_timeout' from source: unknown 7530 1727096011.90583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096011.90652: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096011.90660: variable 'omit' from source: magic vars 7530 1727096011.90665: starting attempt loop 7530 1727096011.90670: running the handler 7530 1727096011.90679: handler run complete 7530 1727096011.90691: attempt loop complete, returning result 7530 1727096011.90694: _execute() done 7530 1727096011.90696: dumping result to json 7530 1727096011.90698: done dumping result, returning 7530 1727096011.90700: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0afff68d-5257-086b-f4f0-000000000169] 7530 1727096011.90709: sending task result for task 0afff68d-5257-086b-f4f0-000000000169 7530 1727096011.90783: done sending task result for task 0afff68d-5257-086b-f4f0-000000000169 7530 1727096011.90786: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": 
false } 7530 1727096011.90844: no more pending results, returning what we have 7530 1727096011.90847: results queue empty 7530 1727096011.90848: checking for any_errors_fatal 7530 1727096011.90855: done checking for any_errors_fatal 7530 1727096011.90855: checking for max_fail_percentage 7530 1727096011.90857: done checking for max_fail_percentage 7530 1727096011.90858: checking to see if all hosts have failed and the running result is not ok 7530 1727096011.90859: done checking to see if all hosts have failed 7530 1727096011.90859: getting the remaining hosts for this loop 7530 1727096011.90861: done getting the remaining hosts for this loop 7530 1727096011.90864: getting the next task for host managed_node3 7530 1727096011.90875: done getting next task for host managed_node3 7530 1727096011.90878: ^ task is: TASK: Fix CentOS6 Base repo 7530 1727096011.90881: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096011.90884: getting variables 7530 1727096011.90885: in VariableManager get_vars() 7530 1727096011.90914: Calling all_inventory to load vars for managed_node3 7530 1727096011.90916: Calling groups_inventory to load vars for managed_node3 7530 1727096011.90922: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096011.90931: Calling all_plugins_play to load vars for managed_node3 7530 1727096011.90934: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096011.90943: Calling groups_plugins_play to load vars for managed_node3 7530 1727096011.91116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096011.91233: done with get_vars() 7530 1727096011.91241: done getting variables 7530 1727096011.91335: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:53:31 -0400 (0:00:00.026) 0:00:02.701 ****** 7530 1727096011.91356: entering _queue_task() for managed_node3/copy 7530 1727096011.91584: worker is 1 (out of 1 available) 7530 1727096011.91596: exiting _queue_task() for managed_node3/copy 7530 1727096011.91607: done queuing things up, now waiting for results queue to drain 7530 1727096011.91609: waiting for pending results... 
7530 1727096011.91765: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo
7530 1727096011.91857: in run() - task 0afff68d-5257-086b-f4f0-00000000016b
7530 1727096011.91871: variable 'ansible_search_path' from source: unknown
7530 1727096011.91875: variable 'ansible_search_path' from source: unknown
7530 1727096011.91902: calling self._execute()
7530 1727096011.91963: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096011.91968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096011.91979: variable 'omit' from source: magic vars
7530 1727096011.92329: variable 'ansible_distribution' from source: facts
7530 1727096011.92347: Evaluated conditional (ansible_distribution == 'CentOS'): True
7530 1727096011.92435: variable 'ansible_distribution_major_version' from source: facts
7530 1727096011.92439: Evaluated conditional (ansible_distribution_major_version == '6'): False
7530 1727096011.92444: when evaluation is False, skipping this task
7530 1727096011.92448: _execute() done
7530 1727096011.92450: dumping result to json
7530 1727096011.92453: done dumping result, returning
7530 1727096011.92458: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0afff68d-5257-086b-f4f0-00000000016b]
7530 1727096011.92463: sending task result for task 0afff68d-5257-086b-f4f0-00000000016b
7530 1727096011.92558: done sending task result for task 0afff68d-5257-086b-f4f0-00000000016b
7530 1727096011.92561: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
7530 1727096011.92657: no more pending results, returning what we have
7530 1727096011.92660: results queue empty
7530 1727096011.92660: checking for any_errors_fatal
7530 1727096011.92664: done checking for any_errors_fatal
7530 1727096011.92665: checking for max_fail_percentage
7530 1727096011.92667: done checking for max_fail_percentage
7530 1727096011.92670: checking to see if all hosts have failed and the running result is not ok
7530 1727096011.92670: done checking to see if all hosts have failed
7530 1727096011.92671: getting the remaining hosts for this loop
7530 1727096011.92672: done getting the remaining hosts for this loop
7530 1727096011.92676: getting the next task for host managed_node3
7530 1727096011.92681: done getting next task for host managed_node3
7530 1727096011.92683: ^ task is: TASK: Include the task 'enable_epel.yml'
7530 1727096011.92686: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096011.92689: getting variables
7530 1727096011.92690: in VariableManager get_vars()
7530 1727096011.92715: Calling all_inventory to load vars for managed_node3
7530 1727096011.92720: Calling groups_inventory to load vars for managed_node3
7530 1727096011.92723: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096011.92733: Calling all_plugins_play to load vars for managed_node3
7530 1727096011.92735: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096011.92737: Calling groups_plugins_play to load vars for managed_node3
7530 1727096011.92856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096011.92998: done with get_vars()
7530 1727096011.93005: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Monday 23 September 2024 08:53:31 -0400 (0:00:00.017) 0:00:02.719 ******
7530 1727096011.93073: entering _queue_task() for managed_node3/include_tasks
7530 1727096011.93303: worker is 1 (out of 1 available)
7530 1727096011.93317: exiting _queue_task() for managed_node3/include_tasks
7530 1727096011.93330: done queuing things up, now waiting for results queue to drain
7530 1727096011.93332: waiting for pending results...
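The skip recorded above is ordinary `when` handling: the conditions are ANDed, and once `ansible_distribution_major_version == '6'` evaluates False the task is skipped without touching the host. The task body of "Fix CentOS6 Base repo" is not shown in this log; a minimal sketch of a task gated the same way (the module and its arguments here are illustrative assumptions, not the real file contents) would be:

```yaml
# Hedged sketch only -- the actual el_repo_setup.yml task body is not in the log.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination, for illustration
    content: "..."
  when:
    - ansible_distribution == 'CentOS'             # evaluated True above
    - ansible_distribution_major_version == '6'    # evaluated False, so the task skips
```

On this CentOS 10 test node the second condition fails, which matches the `"false_condition"` field in the skip result.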
7530 1727096011.93557: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml'
7530 1727096011.93790: in run() - task 0afff68d-5257-086b-f4f0-00000000016c
7530 1727096011.93796: variable 'ansible_search_path' from source: unknown
7530 1727096011.93799: variable 'ansible_search_path' from source: unknown
7530 1727096011.93802: calling self._execute()
7530 1727096011.93804: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096011.93807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096011.93810: variable 'omit' from source: magic vars
7530 1727096011.94344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096011.96071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096011.96123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096011.96160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096011.96189: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096011.96211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096011.96309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096011.96313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096011.96333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096011.96371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096011.96406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096011.96495: variable '__network_is_ostree' from source: set_fact
7530 1727096011.96584: Evaluated conditional (not __network_is_ostree | d(false)): True
7530 1727096011.96587: _execute() done
7530 1727096011.96589: dumping result to json
7530 1727096011.96591: done dumping result, returning
7530 1727096011.96592: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-086b-f4f0-00000000016c]
7530 1727096011.96595: sending task result for task 0afff68d-5257-086b-f4f0-00000000016c
7530 1727096011.96658: done sending task result for task 0afff68d-5257-086b-f4f0-00000000016c
7530 1727096011.96660: WORKER PROCESS EXITING
7530 1727096011.96691: no more pending results, returning what we have
7530 1727096011.96696: in VariableManager get_vars()
7530 1727096011.96732: Calling all_inventory to load vars for managed_node3
7530 1727096011.96735: Calling groups_inventory to load vars for managed_node3
7530 1727096011.96738: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096011.96748: Calling all_plugins_play to load vars for managed_node3
7530 1727096011.96751: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096011.96753: Calling groups_plugins_play to load vars for managed_node3
7530 1727096011.97319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096011.97514: done with get_vars()
7530 1727096011.97534: variable 'ansible_search_path' from source: unknown
7530 1727096011.97535: variable 'ansible_search_path' from source: unknown
7530 1727096011.97573: we have included files to process
7530 1727096011.97575: generating all_blocks data
7530 1727096011.97576: done generating all_blocks data
7530 1727096011.97582: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
7530 1727096011.97584: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
7530 1727096011.97586: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
7530 1727096011.98565: done processing included file
7530 1727096011.98571: iterating over new_blocks loaded from include file
7530 1727096011.98572: in VariableManager get_vars()
7530 1727096011.98586: done with get_vars()
7530 1727096011.98588: filtering new block on tags
7530 1727096011.98613: done filtering new block on tags
7530 1727096011.98616: in VariableManager get_vars()
7530 1727096011.98634: done with get_vars()
7530 1727096011.98636: filtering new block on tags
7530 1727096011.98649: done filtering new block on tags
7530 1727096011.98651: done iterating over new_blocks loaded from include file
included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3
7530 1727096011.98657: extending task lists for all hosts with included blocks
7530 1727096011.98764: done extending task lists
7530 1727096011.98765: done processing included files
7530 1727096011.98766: results queue empty
7530 1727096011.98767: checking for any_errors_fatal
7530 1727096011.98772: done checking for any_errors_fatal
7530 1727096011.98773: checking for max_fail_percentage
7530 1727096011.98774: done checking for max_fail_percentage
7530 1727096011.98774: checking to see if all hosts have failed and the running result is not ok
7530 1727096011.98775: done checking to see if all hosts have failed
7530 1727096011.98776: getting the remaining hosts for this loop
7530 1727096011.98777: done getting the remaining hosts for this loop
7530 1727096011.98779: getting the next task for host managed_node3
7530 1727096011.98783: done getting next task for host managed_node3
7530 1727096011.98785: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
7530 1727096011.98788: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096011.98790: getting variables
7530 1727096011.98791: in VariableManager get_vars()
7530 1727096011.98799: Calling all_inventory to load vars for managed_node3
7530 1727096011.98802: Calling groups_inventory to load vars for managed_node3
7530 1727096011.98804: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096011.98810: Calling all_plugins_play to load vars for managed_node3
7530 1727096011.98818: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096011.98822: Calling groups_plugins_play to load vars for managed_node3
7530 1727096011.98991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096011.99184: done with get_vars()
7530 1727096011.99194: done getting variables
7530 1727096011.99262: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
7530 1727096011.99475: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Monday 23 September 2024 08:53:31 -0400 (0:00:00.064) 0:00:02.783 ******
7530 1727096011.99529: entering _queue_task() for managed_node3/command
7530 1727096011.99531: Creating lock for command
7530 1727096011.99817: worker is 1 (out of 1 available)
7530 1727096011.99833: exiting _queue_task() for managed_node3/command
7530 1727096011.99844: done queuing things up, now waiting for results queue to drain
7530 1727096011.99846: waiting for pending results...
7530 1727096011.99995: running TaskExecutor() for managed_node3/TASK: Create EPEL 10
7530 1727096012.00079: in run() - task 0afff68d-5257-086b-f4f0-000000000186
7530 1727096012.00088: variable 'ansible_search_path' from source: unknown
7530 1727096012.00092: variable 'ansible_search_path' from source: unknown
7530 1727096012.00123: calling self._execute()
7530 1727096012.00181: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.00190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.00196: variable 'omit' from source: magic vars
7530 1727096012.00478: variable 'ansible_distribution' from source: facts
7530 1727096012.00487: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7530 1727096012.00583: variable 'ansible_distribution_major_version' from source: facts
7530 1727096012.00587: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7530 1727096012.00590: when evaluation is False, skipping this task
7530 1727096012.00592: _execute() done
7530 1727096012.00596: dumping result to json
7530 1727096012.00598: done dumping result, returning
7530 1727096012.00606: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0afff68d-5257-086b-f4f0-000000000186]
7530 1727096012.00613: sending task result for task 0afff68d-5257-086b-f4f0-000000000186
7530 1727096012.00714: done sending task result for task 0afff68d-5257-086b-f4f0-000000000186
7530 1727096012.00721: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7530 1727096012.00779: no more pending results, returning what we have
7530 1727096012.00783: results queue empty
7530 1727096012.00783: checking for any_errors_fatal
7530 1727096012.00784: done checking for any_errors_fatal
7530 1727096012.00785: checking for max_fail_percentage
7530 1727096012.00787: done checking for max_fail_percentage
7530 1727096012.00787: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.00788: done checking to see if all hosts have failed
7530 1727096012.00789: getting the remaining hosts for this loop
7530 1727096012.00791: done getting the remaining hosts for this loop
7530 1727096012.00794: getting the next task for host managed_node3
7530 1727096012.00800: done getting next task for host managed_node3
7530 1727096012.00802: ^ task is: TASK: Install yum-utils package
7530 1727096012.00806: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.00809: getting variables
7530 1727096012.00810: in VariableManager get_vars()
7530 1727096012.00839: Calling all_inventory to load vars for managed_node3
7530 1727096012.00842: Calling groups_inventory to load vars for managed_node3
7530 1727096012.00845: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.00855: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.00857: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.00860: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.00991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.01109: done with get_vars()
7530 1727096012.01118: done getting variables
7530 1727096012.01195: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Monday 23 September 2024 08:53:32 -0400 (0:00:00.016) 0:00:02.800 ******
7530 1727096012.01217: entering _queue_task() for managed_node3/package
7530 1727096012.01219: Creating lock for package
7530 1727096012.01463: worker is 1 (out of 1 available)
7530 1727096012.01626: exiting _queue_task() for managed_node3/package
7530 1727096012.01636: done queuing things up, now waiting for results queue to drain
7530 1727096012.01638: waiting for pending results...
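The "Install yum-utils package" task loads the generic `package` action plugin rather than `command`. The log confirms only the task name, the action, and the two `when` results; a minimal sketch consistent with that (module arguments are assumptions):

```yaml
# Hedged sketch; the real task arguments are not visible in this log.
- name: Install yum-utils package
  package:
    name: yum-utils      # assumed; only the task name suggests the package
    state: present       # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

The `package` action dispatches to the platform's package manager (dnf/yum here), which is why the action plugin is resolved per host at queue time.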
7530 1727096012.01895: running TaskExecutor() for managed_node3/TASK: Install yum-utils package
7530 1727096012.01900: in run() - task 0afff68d-5257-086b-f4f0-000000000187
7530 1727096012.01903: variable 'ansible_search_path' from source: unknown
7530 1727096012.01906: variable 'ansible_search_path' from source: unknown
7530 1727096012.01924: calling self._execute()
7530 1727096012.02119: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.02131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.02145: variable 'omit' from source: magic vars
7530 1727096012.02554: variable 'ansible_distribution' from source: facts
7530 1727096012.02563: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7530 1727096012.02666: variable 'ansible_distribution_major_version' from source: facts
7530 1727096012.02672: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7530 1727096012.02675: when evaluation is False, skipping this task
7530 1727096012.02678: _execute() done
7530 1727096012.02681: dumping result to json
7530 1727096012.02683: done dumping result, returning
7530 1727096012.02691: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0afff68d-5257-086b-f4f0-000000000187]
7530 1727096012.02695: sending task result for task 0afff68d-5257-086b-f4f0-000000000187
7530 1727096012.02787: done sending task result for task 0afff68d-5257-086b-f4f0-000000000187
7530 1727096012.02790: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7530 1727096012.02901: no more pending results, returning what we have
7530 1727096012.02903: results queue empty
7530 1727096012.02904: checking for any_errors_fatal
7530 1727096012.02908: done checking for any_errors_fatal
7530 1727096012.02908: checking for max_fail_percentage
7530 1727096012.02910: done checking for max_fail_percentage
7530 1727096012.02911: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.02911: done checking to see if all hosts have failed
7530 1727096012.02912: getting the remaining hosts for this loop
7530 1727096012.02913: done getting the remaining hosts for this loop
7530 1727096012.02917: getting the next task for host managed_node3
7530 1727096012.02924: done getting next task for host managed_node3
7530 1727096012.02926: ^ task is: TASK: Enable EPEL 7
7530 1727096012.02929: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.02932: getting variables
7530 1727096012.02934: in VariableManager get_vars()
7530 1727096012.02954: Calling all_inventory to load vars for managed_node3
7530 1727096012.02956: Calling groups_inventory to load vars for managed_node3
7530 1727096012.02959: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.02979: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.02982: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.02986: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.03095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.03217: done with get_vars()
7530 1727096012.03225: done getting variables
7530 1727096012.03266: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Monday 23 September 2024 08:53:32 -0400 (0:00:00.020) 0:00:02.821 ******
7530 1727096012.03289: entering _queue_task() for managed_node3/command
7530 1727096012.03509: worker is 1 (out of 1 available)
7530 1727096012.03521: exiting _queue_task() for managed_node3/command
7530 1727096012.03531: done queuing things up, now waiting for results queue to drain
7530 1727096012.03533: waiting for pending results...
7530 1727096012.03688: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7
7530 1727096012.03764: in run() - task 0afff68d-5257-086b-f4f0-000000000188
7530 1727096012.03776: variable 'ansible_search_path' from source: unknown
7530 1727096012.03779: variable 'ansible_search_path' from source: unknown
7530 1727096012.03808: calling self._execute()
7530 1727096012.03874: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.03882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.03886: variable 'omit' from source: magic vars
7530 1727096012.04177: variable 'ansible_distribution' from source: facts
7530 1727096012.04188: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7530 1727096012.04284: variable 'ansible_distribution_major_version' from source: facts
7530 1727096012.04288: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7530 1727096012.04291: when evaluation is False, skipping this task
7530 1727096012.04294: _execute() done
7530 1727096012.04298: dumping result to json
7530 1727096012.04300: done dumping result, returning
7530 1727096012.04308: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0afff68d-5257-086b-f4f0-000000000188]
7530 1727096012.04312: sending task result for task 0afff68d-5257-086b-f4f0-000000000188
7530 1727096012.04399: done sending task result for task 0afff68d-5257-086b-f4f0-000000000188
7530 1727096012.04402: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7530 1727096012.04458: no more pending results, returning what we have
7530 1727096012.04462: results queue empty
7530 1727096012.04462: checking for any_errors_fatal
7530 1727096012.04469: done checking for any_errors_fatal
7530 1727096012.04470: checking for max_fail_percentage
7530 1727096012.04472: done checking for max_fail_percentage
7530 1727096012.04473: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.04474: done checking to see if all hosts have failed
7530 1727096012.04474: getting the remaining hosts for this loop
7530 1727096012.04476: done getting the remaining hosts for this loop
7530 1727096012.04479: getting the next task for host managed_node3
7530 1727096012.04491: done getting next task for host managed_node3
7530 1727096012.04493: ^ task is: TASK: Enable EPEL 8
7530 1727096012.04497: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.04500: getting variables
7530 1727096012.04502: in VariableManager get_vars()
7530 1727096012.04529: Calling all_inventory to load vars for managed_node3
7530 1727096012.04532: Calling groups_inventory to load vars for managed_node3
7530 1727096012.04534: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.04544: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.04547: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.04549: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.04748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.04997: done with get_vars()
7530 1727096012.05009: done getting variables
7530 1727096012.05080: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Monday 23 September 2024 08:53:32 -0400 (0:00:00.018) 0:00:02.839 ******
7530 1727096012.05114: entering _queue_task() for managed_node3/command
7530 1727096012.05462: worker is 1 (out of 1 available)
7530 1727096012.05478: exiting _queue_task() for managed_node3/command
7530 1727096012.05492: done queuing things up, now waiting for results queue to drain
7530 1727096012.05494: waiting for pending results...
7530 1727096012.05887: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8
7530 1727096012.05893: in run() - task 0afff68d-5257-086b-f4f0-000000000189
7530 1727096012.05896: variable 'ansible_search_path' from source: unknown
7530 1727096012.05899: variable 'ansible_search_path' from source: unknown
7530 1727096012.05943: calling self._execute()
7530 1727096012.06038: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.06051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.06065: variable 'omit' from source: magic vars
7530 1727096012.06477: variable 'ansible_distribution' from source: facts
7530 1727096012.06497: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
7530 1727096012.06628: variable 'ansible_distribution_major_version' from source: facts
7530 1727096012.06639: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
7530 1727096012.06646: when evaluation is False, skipping this task
7530 1727096012.06652: _execute() done
7530 1727096012.06658: dumping result to json
7530 1727096012.06664: done dumping result, returning
7530 1727096012.06677: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0afff68d-5257-086b-f4f0-000000000189]
7530 1727096012.06685: sending task result for task 0afff68d-5257-086b-f4f0-000000000189
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
7530 1727096012.06869: no more pending results, returning what we have
7530 1727096012.06872: results queue empty
7530 1727096012.06873: checking for any_errors_fatal
7530 1727096012.06877: done checking for any_errors_fatal
7530 1727096012.06878: checking for max_fail_percentage
7530 1727096012.06880: done checking for max_fail_percentage
7530 1727096012.06880: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.06882: done checking to see if all hosts have failed
7530 1727096012.06882: getting the remaining hosts for this loop
7530 1727096012.06884: done getting the remaining hosts for this loop
7530 1727096012.06887: getting the next task for host managed_node3
7530 1727096012.06896: done getting next task for host managed_node3
7530 1727096012.06898: ^ task is: TASK: Enable EPEL 6
7530 1727096012.06902: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.06907: getting variables
7530 1727096012.06909: in VariableManager get_vars()
7530 1727096012.06942: Calling all_inventory to load vars for managed_node3
7530 1727096012.06945: Calling groups_inventory to load vars for managed_node3
7530 1727096012.06949: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.06962: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.06965: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.06970: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.07124: done sending task result for task 0afff68d-5257-086b-f4f0-000000000189
7530 1727096012.07127: WORKER PROCESS EXITING
7530 1727096012.07150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.07340: done with get_vars()
7530 1727096012.07352: done getting variables
7530 1727096012.07424: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Monday 23 September 2024 08:53:32 -0400 (0:00:00.023) 0:00:02.862 ******
7530 1727096012.07452: entering _queue_task() for managed_node3/copy
7530 1727096012.07757: worker is 1 (out of 1 available)
7530 1727096012.07771: exiting _queue_task() for managed_node3/copy
7530 1727096012.07783: done queuing things up, now waiting for results queue to drain
7530 1727096012.07785: waiting for pending results...
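Unlike the EPEL 7/8 tasks, which load the `command` action plugin, "Enable EPEL 6" loads `copy` and (as its skip result below shows) is gated on major version 6. A hedged sketch consistent with what the log reveals (destination and content are not shown and are left as placeholders):

```yaml
# Hedged sketch; only the task name, the copy action, and the when results
# are confirmed by the log.
- name: Enable EPEL 6
  copy:
    dest: ...     # not shown in the log
    content: ...  # not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```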
7530 1727096012.08058: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 7530 1727096012.08189: in run() - task 0afff68d-5257-086b-f4f0-00000000018b 7530 1727096012.08209: variable 'ansible_search_path' from source: unknown 7530 1727096012.08282: variable 'ansible_search_path' from source: unknown 7530 1727096012.08285: calling self._execute() 7530 1727096012.08357: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096012.08370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096012.08396: variable 'omit' from source: magic vars 7530 1727096012.09048: variable 'ansible_distribution' from source: facts 7530 1727096012.09052: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7530 1727096012.09340: variable 'ansible_distribution_major_version' from source: facts 7530 1727096012.09377: Evaluated conditional (ansible_distribution_major_version == '6'): False 7530 1727096012.09385: when evaluation is False, skipping this task 7530 1727096012.09394: _execute() done 7530 1727096012.09454: dumping result to json 7530 1727096012.09462: done dumping result, returning 7530 1727096012.09480: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0afff68d-5257-086b-f4f0-00000000018b] 7530 1727096012.09490: sending task result for task 0afff68d-5257-086b-f4f0-00000000018b 7530 1727096012.09804: done sending task result for task 0afff68d-5257-086b-f4f0-00000000018b 7530 1727096012.09808: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 7530 1727096012.09863: no more pending results, returning what we have 7530 1727096012.09871: results queue empty 7530 1727096012.09872: checking for any_errors_fatal 7530 1727096012.09877: done checking for any_errors_fatal 7530 1727096012.09878: checking for max_fail_percentage 7530 1727096012.09880: done 
checking for max_fail_percentage
7530 1727096012.09881: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.09882: done checking to see if all hosts have failed
7530 1727096012.09883: getting the remaining hosts for this loop
7530 1727096012.09885: done getting the remaining hosts for this loop
7530 1727096012.09889: getting the next task for host managed_node3
7530 1727096012.09898: done getting next task for host managed_node3
7530 1727096012.09901: ^ task is: TASK: Set network provider to 'nm'
7530 1727096012.09924: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.09930: getting variables
7530 1727096012.09932: in VariableManager get_vars()
7530 1727096012.09966: Calling all_inventory to load vars for managed_node3
7530 1727096012.09970: Calling groups_inventory to load vars for managed_node3
7530 1727096012.09974: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.09988: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.09991: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.09994: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.10634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.10874: done with get_vars()
7530 1727096012.10890: done getting variables
7530 1727096012.10955: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:13
Monday 23 September 2024 08:53:32 -0400 (0:00:00.035) 0:00:02.898 ******
7530 1727096012.10988: entering _queue_task() for managed_node3/set_fact
7530 1727096012.11284: worker is 1 (out of 1 available)
7530 1727096012.11300: exiting _queue_task() for managed_node3/set_fact
7530 1727096012.11312: done queuing things up, now waiting for results queue to drain
7530 1727096012.11313: waiting for pending results...
7530 1727096012.11571: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm'
7530 1727096012.11728: in run() - task 0afff68d-5257-086b-f4f0-000000000007
7530 1727096012.11734: variable 'ansible_search_path' from source: unknown
7530 1727096012.11737: calling self._execute()
7530 1727096012.11817: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.11834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.11851: variable 'omit' from source: magic vars
7530 1727096012.11972: variable 'omit' from source: magic vars
7530 1727096012.12023: variable 'omit' from source: magic vars
7530 1727096012.12070: variable 'omit' from source: magic vars
7530 1727096012.12162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096012.12175: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096012.12209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096012.12231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096012.12246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py
(found_in_cache=True, class_only=False)
7530 1727096012.12284: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096012.12292: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.12311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.12426: Set connection var ansible_pipelining to False
7530 1727096012.12439: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096012.12450: Set connection var ansible_timeout to 10
7530 1727096012.12466: Set connection var ansible_shell_executable to /bin/sh
7530 1727096012.12484: Set connection var ansible_shell_type to sh
7530 1727096012.12488: Set connection var ansible_connection to ssh
7530 1727096012.12529: variable 'ansible_shell_executable' from source: unknown
7530 1727096012.12532: variable 'ansible_connection' from source: unknown
7530 1727096012.12595: variable 'ansible_module_compression' from source: unknown
7530 1727096012.12598: variable 'ansible_shell_type' from source: unknown
7530 1727096012.12600: variable 'ansible_shell_executable' from source: unknown
7530 1727096012.12603: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.12605: variable 'ansible_pipelining' from source: unknown
7530 1727096012.12607: variable 'ansible_timeout' from source: unknown
7530 1727096012.12609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.12745: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096012.12761: variable 'omit' from source: magic vars
7530 1727096012.12773: starting attempt loop
7530 1727096012.12780: running the handler
7530 1727096012.12796: handler run complete
7530 1727096012.12817: attempt loop complete, returning result
7530 1727096012.12857: _execute() done
7530 1727096012.12860: dumping result to json
7530 1727096012.12862: done dumping result, returning
7530 1727096012.12864: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0afff68d-5257-086b-f4f0-000000000007]
7530 1727096012.12870: sending task result for task 0afff68d-5257-086b-f4f0-000000000007
ok: [managed_node3] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
7530 1727096012.13189: no more pending results, returning what we have
7530 1727096012.13192: results queue empty
7530 1727096012.13193: checking for any_errors_fatal
7530 1727096012.13201: done checking for any_errors_fatal
7530 1727096012.13201: checking for max_fail_percentage
7530 1727096012.13203: done checking for max_fail_percentage
7530 1727096012.13204: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.13205: done checking to see if all hosts have failed
7530 1727096012.13206: getting the remaining hosts for this loop
7530 1727096012.13207: done getting the remaining hosts for this loop
7530 1727096012.13211: getting the next task for host managed_node3
7530 1727096012.13219: done getting next task for host managed_node3
7530 1727096012.13221: ^ task is: TASK: meta (flush_handlers)
7530 1727096012.13223: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
7530 1727096012.13228: getting variables
7530 1727096012.13229: in VariableManager get_vars()
7530 1727096012.13302: Calling all_inventory to load vars for managed_node3
7530 1727096012.13305: Calling groups_inventory to load vars for managed_node3
7530 1727096012.13309: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.13321: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.13324: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.13327: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.13936: done sending task result for task 0afff68d-5257-086b-f4f0-000000000007
7530 1727096012.13940: WORKER PROCESS EXITING
7530 1727096012.13964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.14351: done with get_vars()
7530 1727096012.14482: done getting variables
7530 1727096012.14905: in VariableManager get_vars()
7530 1727096012.14917: Calling all_inventory to load vars for managed_node3
7530 1727096012.14920: Calling groups_inventory to load vars for managed_node3
7530 1727096012.14922: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.14927: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.14929: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.14932: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.15312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.15862: done with get_vars()
7530 1727096012.15883: done queuing things up, now waiting for results queue to drain
7530 1727096012.15885: results queue empty
7530 1727096012.15886: checking for any_errors_fatal
7530 1727096012.15888: done checking for any_errors_fatal
7530 1727096012.15889: checking for max_fail_percentage
7530 1727096012.15890: done checking for max_fail_percentage
7530 1727096012.15891: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.15892: done checking to see if all hosts have failed
7530 1727096012.15893: getting the remaining hosts for this loop
7530 1727096012.15893: done getting the remaining hosts for this loop
7530 1727096012.15896: getting the next task for host managed_node3
7530 1727096012.15900: done getting next task for host managed_node3
7530 1727096012.15902: ^ task is: TASK: meta (flush_handlers)
7530 1727096012.15904: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.15912: getting variables
7530 1727096012.15913: in VariableManager get_vars()
7530 1727096012.15923: Calling all_inventory to load vars for managed_node3
7530 1727096012.15925: Calling groups_inventory to load vars for managed_node3
7530 1727096012.15927: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.15932: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.15934: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.15938: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.16307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.16718: done with get_vars()
7530 1727096012.16731: done getting variables
7530 1727096012.16784: in VariableManager get_vars()
7530 1727096012.16794: Calling all_inventory to load vars for managed_node3
7530 1727096012.16797: Calling groups_inventory to load vars for managed_node3
7530 1727096012.16799: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.16804: Calling all_plugins_play to load vars for
managed_node3
7530 1727096012.16807: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.16810: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.17139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.17361: done with get_vars()
7530 1727096012.17386: done queuing things up, now waiting for results queue to drain
7530 1727096012.17388: results queue empty
7530 1727096012.17389: checking for any_errors_fatal
7530 1727096012.17391: done checking for any_errors_fatal
7530 1727096012.17391: checking for max_fail_percentage
7530 1727096012.17392: done checking for max_fail_percentage
7530 1727096012.17394: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.17395: done checking to see if all hosts have failed
7530 1727096012.17395: getting the remaining hosts for this loop
7530 1727096012.17396: done getting the remaining hosts for this loop
7530 1727096012.17399: getting the next task for host managed_node3
7530 1727096012.17402: done getting next task for host managed_node3
7530 1727096012.17403: ^ task is: None
7530 1727096012.17404: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.17406: done queuing things up, now waiting for results queue to drain
7530 1727096012.17407: results queue empty
7530 1727096012.17407: checking for any_errors_fatal
7530 1727096012.17408: done checking for any_errors_fatal
7530 1727096012.17409: checking for max_fail_percentage
7530 1727096012.17409: done checking for max_fail_percentage
7530 1727096012.17410: checking to see if all hosts have failed and the running result is not ok
7530 1727096012.17411: done checking to see if all hosts have failed
7530 1727096012.17412: getting the next task for host managed_node3
7530 1727096012.17415: done getting next task for host managed_node3
7530 1727096012.17415: ^ task is: None
7530 1727096012.17417: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
7530 1727096012.17470: in VariableManager get_vars()
7530 1727096012.17514: done with get_vars()
7530 1727096012.17521: in VariableManager get_vars()
7530 1727096012.17540: done with get_vars()
7530 1727096012.17545: variable 'omit' from source: magic vars
7530 1727096012.17579: in VariableManager get_vars()
7530 1727096012.17609: done with get_vars()
7530 1727096012.17631: variable 'omit' from source: magic vars

PLAY [Play for testing auto_gateway setting] ***********************************
7530 1727096012.18237: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
7530 1727096012.18277: getting the remaining hosts for this loop
7530 1727096012.18279: done getting the remaining hosts for this loop
7530 1727096012.18281: getting the next task for host managed_node3
7530 1727096012.18284: done getting next task for host managed_node3
7530 1727096012.18286: ^ task is: TASK: Gathering Facts
7530 1727096012.18287: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096012.18289: getting variables
7530 1727096012.18290: in VariableManager get_vars()
7530 1727096012.18308: Calling all_inventory to load vars for managed_node3
7530 1727096012.18310: Calling groups_inventory to load vars for managed_node3
7530 1727096012.18312: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096012.18318: Calling all_plugins_play to load vars for managed_node3
7530 1727096012.18332: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096012.18335: Calling groups_plugins_play to load vars for managed_node3
7530 1727096012.18497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096012.18699: done with get_vars()
7530 1727096012.18709: done getting variables
7530 1727096012.18750: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
Monday 23 September 2024 08:53:32 -0400 (0:00:00.077) 0:00:02.976 ******
7530 1727096012.18777: entering _queue_task() for managed_node3/gather_facts
7530 1727096012.19093: worker is 1 (out of 1 available)
7530 1727096012.19104: exiting _queue_task() for managed_node3/gather_facts
7530 1727096012.19237: done queuing things up, now waiting for results queue to drain
7530 1727096012.19239: waiting for pending results...
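Before the Gathering Facts trace begins, the records above show two task outcomes worth decoding: "Enable EPEL 6" was skipped because its `when:` conditional (`ansible_distribution_major_version == '6'`) evaluated to False, and "Set network provider to 'nm'" returned `ok` after registering the fact `network_provider: nm` entirely on the controller (note the absence of any `_low_level_execute_command()` records for it). A minimal, hypothetical sketch of tasks that would produce records like these — the real tasks live in the fedora.linux_system_roles test files named on the `task path:` lines, and the EPEL action shown here is illustrative only:

```yaml
# Hypothetical reconstruction, for reading the log above; not the
# collection's actual task file.
- name: Enable EPEL 6
  # Logged as skipped: the second conditional evaluated to False,
  # yielding "skip_reason": "Conditional result was False".
  command: echo "would enable EPEL here"   # placeholder action
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'

- name: Set network provider to 'nm'
  # set_fact runs on the controller, so the log goes straight from
  # "running the handler" to "handler run complete" with no SSH traffic.
  set_fact:
    network_provider: nm
```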
7530 1727096012.19400: running TaskExecutor() for managed_node3/TASK: Gathering Facts
7530 1727096012.19565: in run() - task 0afff68d-5257-086b-f4f0-0000000001b1
7530 1727096012.19571: variable 'ansible_search_path' from source: unknown
7530 1727096012.19590: calling self._execute()
7530 1727096012.19776: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.19781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.19784: variable 'omit' from source: magic vars
7530 1727096012.20093: variable 'ansible_distribution_major_version' from source: facts
7530 1727096012.20125: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096012.20134: variable 'omit' from source: magic vars
7530 1727096012.20211: variable 'omit' from source: magic vars
7530 1727096012.20214: variable 'omit' from source: magic vars
7530 1727096012.20256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096012.20297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096012.20429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096012.20461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096012.20480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096012.20515: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096012.20524: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.20552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.20661: Set connection var ansible_pipelining to False
7530 1727096012.20678: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096012.20755: Set connection var ansible_timeout to 10
7530 1727096012.20758: Set connection var ansible_shell_executable to /bin/sh
7530 1727096012.20765: Set connection var ansible_shell_type to sh
7530 1727096012.20771: Set connection var ansible_connection to ssh
7530 1727096012.20773: variable 'ansible_shell_executable' from source: unknown
7530 1727096012.20775: variable 'ansible_connection' from source: unknown
7530 1727096012.20777: variable 'ansible_module_compression' from source: unknown
7530 1727096012.20779: variable 'ansible_shell_type' from source: unknown
7530 1727096012.20781: variable 'ansible_shell_executable' from source: unknown
7530 1727096012.20783: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096012.20785: variable 'ansible_pipelining' from source: unknown
7530 1727096012.20787: variable 'ansible_timeout' from source: unknown
7530 1727096012.20789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096012.21012: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096012.21015: variable 'omit' from source: magic vars
7530 1727096012.21018: starting attempt loop
7530 1727096012.21083: running the handler
7530 1727096012.21091: variable 'ansible_facts' from source: unknown
7530 1727096012.21093: _low_level_execute_command(): starting
7530 1727096012.21095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096012.21973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096012.22233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096012.22283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096012.24002: stdout chunk (state=3): >>>/root <<<
7530 1727096012.24378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096012.24382: stdout chunk (state=3): >>><<<
7530 1727096012.24385: stderr chunk (state=3): >>><<<
7530 1727096012.24388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096012.24391: _low_level_execute_command(): starting
7530 1727096012.24393: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145 `" && echo ansible-tmp-1727096012.2423375-7681-269719268662145="` echo /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145 `" ) && sleep 0'
7530 1727096012.25396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7530 1727096012.25410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096012.25430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096012.25449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7530 1727096012.25474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<<
7530 1727096012.25485: stderr chunk (state=3): >>>debug2: match not found <<<
7530 1727096012.25499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096012.25581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096012.25613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096012.25634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096012.25656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096012.25729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096012.27753: stdout chunk (state=3): >>>ansible-tmp-1727096012.2423375-7681-269719268662145=/root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145 <<<
7530 1727096012.27973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096012.27982: stderr chunk (state=3): >>><<<
7530 1727096012.27985: stdout chunk (state=3): >>><<<
7530 1727096012.28063: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096012.2423375-7681-269719268662145=/root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096012.28069: variable 'ansible_module_compression' from source: unknown
7530 1727096012.28072: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
7530 1727096012.28133: variable 'ansible_facts' from source: unknown
7530 1727096012.28271: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py
7530 1727096012.28386: Sending initial data
7530 1727096012.28390: Sent initial data (152 bytes)
7530 1727096012.28849: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096012.28853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<<
7530 1727096012.28855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096012.28912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096012.28916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096012.28964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096012.31082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
7530 1727096012.31177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py" <<<
7530 1727096012.31181: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp4t2fcvxl /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py <<<
7530 1727096012.31183: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp4t2fcvxl" to remote "/root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py" <<<
7530 1727096012.32933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096012.32948: stderr chunk (state=3): >>><<<
7530 1727096012.32974: stdout chunk (state=3): >>><<<
7530 1727096012.32994: done transferring module to remote
7530 1727096012.33092: _low_level_execute_command(): starting
7530 1727096012.33096: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/ /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py && sleep 0'
7530 1727096012.33682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7530 1727096012.33699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096012.33721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096012.33831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7530 1727096012.33849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096012.33880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096012.33969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096012.36253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096012.36258: stdout chunk (state=3): >>><<< 7530 1727096012.36261: stderr chunk (state=3): >>><<< 7530 1727096012.36364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096012.36370: _low_level_execute_command(): starting 7530 1727096012.36374: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/AnsiballZ_setup.py && sleep 0' 7530 1727096012.36955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096012.36974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096012.36988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096012.37020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096012.37111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096012.37145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 
1727096012.37215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.17294: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_loadavg": {"1m": 0.39501953125, "5m": 0.42822265625, "15m": 0.1806640625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3044, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 487, "free": 3044}, "nocache": {"free": 3318, "used": 213}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1",<<< 7530 1727096013.17323: stdout chunk (state=3): >>> "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 155, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815853056, "block_size": 4096, "block_total": 65519099, "block_available": 63919886, "block_used": 1599213, "inode_total": 131070960, "inode_available": 131029227, "inode_used": 41733, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 
0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "53", "second": "33", "epoch": "1727096013", "epoch_int": "1727096013", "date": "2024-09-23", "time": "08:53:33", "iso8601_micro": "2024-09-23T12:53:33.129401Z", "iso8601": "2024-09-23T12:53:33Z", "iso8601_basic": "20240923T085333129401", "iso8601_basic_short": "20240923T085333", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offlo<<< 7530 1727096013.17476: stdout chunk (state=3): >>>ad": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7530 1727096013.20598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096013.20602: stdout chunk (state=3): >>><<< 7530 1727096013.20604: stderr chunk (state=3): >>><<< 7530 1727096013.20609: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_loadavg": {"1m": 0.39501953125, "5m": 0.42822265625, "15m": 0.1806640625}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3044, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 487, "free": 3044}, "nocache": {"free": 3318, "used": 213}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": 
"NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 155, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815853056, "block_size": 4096, "block_total": 65519099, "block_available": 63919886, "block_used": 1599213, "inode_total": 131070960, "inode_available": 131029227, "inode_used": 41733, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", 
"ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "53", "second": "33", "epoch": "1727096013", "epoch_int": "1727096013", "date": "2024-09-23", "time": "08:53:33", "iso8601_micro": "2024-09-23T12:53:33.129401Z", "iso8601": "2024-09-23T12:53:33Z", "iso8601_basic": "20240923T085333129401", "iso8601_basic_short": "20240923T085333", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off 
[fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": 
"/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096013.20886: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096013.20916: _low_level_execute_command(): starting 7530 1727096013.20927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096012.2423375-7681-269719268662145/ > /dev/null 2>&1 && sleep 0' 7530 1727096013.21484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096013.21498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096013.21514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096013.21531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096013.21549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096013.21561: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096013.21579: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.21642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.21687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096013.21702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.21716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.21758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.24440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096013.24450: stdout chunk (state=3): >>><<< 7530 1727096013.24460: stderr chunk (state=3): >>><<< 7530 1727096013.24483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096013.24496: handler run complete 7530 1727096013.24634: variable 'ansible_facts' from source: unknown 7530 1727096013.24751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.24950: variable 'ansible_facts' from source: unknown 7530 1727096013.25025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.25103: attempt loop complete, returning result 7530 1727096013.25106: _execute() done 7530 1727096013.25109: dumping result to json 7530 1727096013.25129: done dumping result, returning 7530 1727096013.25136: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-086b-f4f0-0000000001b1] 7530 1727096013.25140: sending task result for task 0afff68d-5257-086b-f4f0-0000000001b1 7530 1727096013.25423: done sending task result for task 0afff68d-5257-086b-f4f0-0000000001b1 7530 1727096013.25426: WORKER PROCESS EXITING ok: [managed_node3] 7530 1727096013.25612: no more pending results, returning what we have 7530 1727096013.25614: results queue empty 7530 1727096013.25614: checking for any_errors_fatal 7530 1727096013.25615: done checking for any_errors_fatal 7530 1727096013.25615: checking for max_fail_percentage 7530 1727096013.25616: done checking for max_fail_percentage 7530 1727096013.25617: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.25617: done 
checking to see if all hosts have failed 7530 1727096013.25618: getting the remaining hosts for this loop 7530 1727096013.25619: done getting the remaining hosts for this loop 7530 1727096013.25622: getting the next task for host managed_node3 7530 1727096013.25626: done getting next task for host managed_node3 7530 1727096013.25627: ^ task is: TASK: meta (flush_handlers) 7530 1727096013.25628: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096013.25630: getting variables 7530 1727096013.25631: in VariableManager get_vars() 7530 1727096013.25658: Calling all_inventory to load vars for managed_node3 7530 1727096013.25660: Calling groups_inventory to load vars for managed_node3 7530 1727096013.25661: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.25670: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.25672: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.25674: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.25777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.25890: done with get_vars() 7530 1727096013.25897: done getting variables 7530 1727096013.25946: in VariableManager get_vars() 7530 1727096013.25958: Calling all_inventory to load vars for managed_node3 7530 1727096013.25960: Calling groups_inventory to load vars for managed_node3 7530 1727096013.25961: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.25964: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.25965: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.25969: Calling 
groups_plugins_play to load vars for managed_node3 7530 1727096013.26050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.26158: done with get_vars() 7530 1727096013.26168: done queuing things up, now waiting for results queue to drain 7530 1727096013.26174: results queue empty 7530 1727096013.26174: checking for any_errors_fatal 7530 1727096013.26176: done checking for any_errors_fatal 7530 1727096013.26177: checking for max_fail_percentage 7530 1727096013.26177: done checking for max_fail_percentage 7530 1727096013.26178: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.26178: done checking to see if all hosts have failed 7530 1727096013.26179: getting the remaining hosts for this loop 7530 1727096013.26179: done getting the remaining hosts for this loop 7530 1727096013.26181: getting the next task for host managed_node3 7530 1727096013.26184: done getting next task for host managed_node3 7530 1727096013.26185: ^ task is: TASK: Include the task 'show_interfaces.yml' 7530 1727096013.26187: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096013.26189: getting variables 7530 1727096013.26190: in VariableManager get_vars() 7530 1727096013.26202: Calling all_inventory to load vars for managed_node3 7530 1727096013.26204: Calling groups_inventory to load vars for managed_node3 7530 1727096013.26205: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.26208: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.26209: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.26210: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.26289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.26417: done with get_vars() 7530 1727096013.26423: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:9 Monday 23 September 2024 08:53:33 -0400 (0:00:01.076) 0:00:04.053 ****** 7530 1727096013.26473: entering _queue_task() for managed_node3/include_tasks 7530 1727096013.26692: worker is 1 (out of 1 available) 7530 1727096013.26706: exiting _queue_task() for managed_node3/include_tasks 7530 1727096013.26716: done queuing things up, now waiting for results queue to drain 7530 1727096013.26718: waiting for pending results... 
7530 1727096013.27199: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7530 1727096013.27203: in run() - task 0afff68d-5257-086b-f4f0-00000000000b 7530 1727096013.27206: variable 'ansible_search_path' from source: unknown 7530 1727096013.27209: calling self._execute() 7530 1727096013.27210: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.27212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.27214: variable 'omit' from source: magic vars 7530 1727096013.27622: variable 'ansible_distribution_major_version' from source: facts 7530 1727096013.27626: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096013.27628: _execute() done 7530 1727096013.27630: dumping result to json 7530 1727096013.27632: done dumping result, returning 7530 1727096013.27634: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-086b-f4f0-00000000000b] 7530 1727096013.27636: sending task result for task 0afff68d-5257-086b-f4f0-00000000000b 7530 1727096013.27816: done sending task result for task 0afff68d-5257-086b-f4f0-00000000000b 7530 1727096013.27819: WORKER PROCESS EXITING 7530 1727096013.27856: no more pending results, returning what we have 7530 1727096013.27860: in VariableManager get_vars() 7530 1727096013.27914: Calling all_inventory to load vars for managed_node3 7530 1727096013.27917: Calling groups_inventory to load vars for managed_node3 7530 1727096013.27919: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.27929: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.27931: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.27934: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.28068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 7530 1727096013.28185: done with get_vars() 7530 1727096013.28190: variable 'ansible_search_path' from source: unknown 7530 1727096013.28203: we have included files to process 7530 1727096013.28204: generating all_blocks data 7530 1727096013.28205: done generating all_blocks data 7530 1727096013.28206: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096013.28207: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096013.28209: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096013.28320: in VariableManager get_vars() 7530 1727096013.28337: done with get_vars() 7530 1727096013.28410: done processing included file 7530 1727096013.28411: iterating over new_blocks loaded from include file 7530 1727096013.28412: in VariableManager get_vars() 7530 1727096013.28434: done with get_vars() 7530 1727096013.28435: filtering new block on tags 7530 1727096013.28446: done filtering new block on tags 7530 1727096013.28448: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7530 1727096013.28451: extending task lists for all hosts with included blocks 7530 1727096013.30656: done extending task lists 7530 1727096013.30658: done processing included files 7530 1727096013.30658: results queue empty 7530 1727096013.30659: checking for any_errors_fatal 7530 1727096013.30660: done checking for any_errors_fatal 7530 1727096013.30660: checking for max_fail_percentage 7530 1727096013.30661: done checking for max_fail_percentage 7530 1727096013.30661: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.30662: 
done checking to see if all hosts have failed 7530 1727096013.30662: getting the remaining hosts for this loop 7530 1727096013.30663: done getting the remaining hosts for this loop 7530 1727096013.30665: getting the next task for host managed_node3 7530 1727096013.30669: done getting next task for host managed_node3 7530 1727096013.30671: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7530 1727096013.30672: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096013.30674: getting variables 7530 1727096013.30675: in VariableManager get_vars() 7530 1727096013.30692: Calling all_inventory to load vars for managed_node3 7530 1727096013.30695: Calling groups_inventory to load vars for managed_node3 7530 1727096013.30696: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.30702: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.30703: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.30705: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.30947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.31056: done with get_vars() 7530 1727096013.31063: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:53:33 -0400 (0:00:00.046) 0:00:04.099 ****** 7530 1727096013.31116: entering _queue_task() for managed_node3/include_tasks 7530 1727096013.31347: worker is 1 (out of 1 available) 7530 1727096013.31358: exiting _queue_task() for managed_node3/include_tasks 7530 1727096013.31371: done queuing things up, now waiting for results queue to drain 7530 1727096013.31374: waiting for pending results... 7530 1727096013.31529: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7530 1727096013.31602: in run() - task 0afff68d-5257-086b-f4f0-0000000001ca 7530 1727096013.31616: variable 'ansible_search_path' from source: unknown 7530 1727096013.31619: variable 'ansible_search_path' from source: unknown 7530 1727096013.31646: calling self._execute() 7530 1727096013.31708: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.31713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.31725: variable 'omit' from source: magic vars 7530 1727096013.32008: variable 'ansible_distribution_major_version' from source: facts 7530 1727096013.32018: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096013.32028: _execute() done 7530 1727096013.32031: dumping result to json 7530 1727096013.32034: done dumping result, returning 7530 1727096013.32042: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-086b-f4f0-0000000001ca] 7530 1727096013.32044: sending task result for task 0afff68d-5257-086b-f4f0-0000000001ca 7530 1727096013.32132: done sending task result for task 0afff68d-5257-086b-f4f0-0000000001ca 7530 1727096013.32134: WORKER PROCESS EXITING 7530 1727096013.32171: no more pending results, returning what we have 7530 1727096013.32176: in 
VariableManager get_vars() 7530 1727096013.32231: Calling all_inventory to load vars for managed_node3 7530 1727096013.32234: Calling groups_inventory to load vars for managed_node3 7530 1727096013.32236: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.32249: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.32251: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.32254: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.32403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.32530: done with get_vars() 7530 1727096013.32536: variable 'ansible_search_path' from source: unknown 7530 1727096013.32537: variable 'ansible_search_path' from source: unknown 7530 1727096013.32566: we have included files to process 7530 1727096013.32569: generating all_blocks data 7530 1727096013.32570: done generating all_blocks data 7530 1727096013.32571: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096013.32571: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096013.32573: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096013.32815: done processing included file 7530 1727096013.32817: iterating over new_blocks loaded from include file 7530 1727096013.32820: in VariableManager get_vars() 7530 1727096013.32836: done with get_vars() 7530 1727096013.32837: filtering new block on tags 7530 1727096013.32848: done filtering new block on tags 7530 1727096013.32850: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7530 1727096013.32853: extending task lists for all hosts with included blocks 7530 1727096013.32921: done extending task lists 7530 1727096013.32922: done processing included files 7530 1727096013.32923: results queue empty 7530 1727096013.32923: checking for any_errors_fatal 7530 1727096013.32925: done checking for any_errors_fatal 7530 1727096013.32926: checking for max_fail_percentage 7530 1727096013.32927: done checking for max_fail_percentage 7530 1727096013.32927: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.32928: done checking to see if all hosts have failed 7530 1727096013.32928: getting the remaining hosts for this loop 7530 1727096013.32929: done getting the remaining hosts for this loop 7530 1727096013.32931: getting the next task for host managed_node3 7530 1727096013.32933: done getting next task for host managed_node3 7530 1727096013.32935: ^ task is: TASK: Gather current interface info 7530 1727096013.32937: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096013.32938: getting variables 7530 1727096013.32939: in VariableManager get_vars() 7530 1727096013.32949: Calling all_inventory to load vars for managed_node3 7530 1727096013.32951: Calling groups_inventory to load vars for managed_node3 7530 1727096013.32952: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.32956: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.32958: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.32959: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.33047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.33165: done with get_vars() 7530 1727096013.33173: done getting variables 7530 1727096013.33200: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:53:33 -0400 (0:00:00.021) 0:00:04.120 ****** 7530 1727096013.33225: entering _queue_task() for managed_node3/command 7530 1727096013.33455: worker is 1 (out of 1 available) 7530 1727096013.33470: exiting _queue_task() for managed_node3/command 7530 1727096013.33483: done queuing things up, now waiting for results queue to drain 7530 1727096013.33485: waiting for pending results... 
7530 1727096013.33637: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7530 1727096013.33705: in run() - task 0afff68d-5257-086b-f4f0-000000000389 7530 1727096013.33716: variable 'ansible_search_path' from source: unknown 7530 1727096013.33726: variable 'ansible_search_path' from source: unknown 7530 1727096013.33758: calling self._execute() 7530 1727096013.33834: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.33838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.33842: variable 'omit' from source: magic vars 7530 1727096013.34148: variable 'ansible_distribution_major_version' from source: facts 7530 1727096013.34164: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096013.34169: variable 'omit' from source: magic vars 7530 1727096013.34201: variable 'omit' from source: magic vars 7530 1727096013.34226: variable 'omit' from source: magic vars 7530 1727096013.34262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096013.34294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096013.34310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096013.34324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096013.34333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096013.34356: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096013.34359: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.34361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.34435: Set connection 
var ansible_pipelining to False 7530 1727096013.34440: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096013.34445: Set connection var ansible_timeout to 10 7530 1727096013.34452: Set connection var ansible_shell_executable to /bin/sh 7530 1727096013.34455: Set connection var ansible_shell_type to sh 7530 1727096013.34458: Set connection var ansible_connection to ssh 7530 1727096013.34478: variable 'ansible_shell_executable' from source: unknown 7530 1727096013.34481: variable 'ansible_connection' from source: unknown 7530 1727096013.34485: variable 'ansible_module_compression' from source: unknown 7530 1727096013.34489: variable 'ansible_shell_type' from source: unknown 7530 1727096013.34492: variable 'ansible_shell_executable' from source: unknown 7530 1727096013.34494: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.34496: variable 'ansible_pipelining' from source: unknown 7530 1727096013.34498: variable 'ansible_timeout' from source: unknown 7530 1727096013.34500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.34601: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096013.34616: variable 'omit' from source: magic vars 7530 1727096013.34622: starting attempt loop 7530 1727096013.34624: running the handler 7530 1727096013.34635: _low_level_execute_command(): starting 7530 1727096013.34641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096013.35159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096013.35165: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096013.35170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.35223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096013.35227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.35233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.35275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.37594: stdout chunk (state=3): >>>/root <<< 7530 1727096013.37728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096013.37764: stderr chunk (state=3): >>><<< 7530 1727096013.37771: stdout chunk (state=3): >>><<< 7530 1727096013.37790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096013.37804: _low_level_execute_command(): starting 7530 1727096013.37810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831 `" && echo ansible-tmp-1727096013.3779192-7723-269006671528831="` echo /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831 `" ) && sleep 0' 7530 1727096013.38277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096013.38280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.38282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096013.38292: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096013.38297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.38333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.38345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.38393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.40401: stdout chunk (state=3): >>>ansible-tmp-1727096013.3779192-7723-269006671528831=/root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831 <<< 7530 1727096013.40501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096013.40533: stderr chunk (state=3): >>><<< 7530 1727096013.40536: stdout chunk (state=3): >>><<< 7530 1727096013.40553: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096013.3779192-7723-269006671528831=/root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096013.40587: variable 'ansible_module_compression' from source: unknown 7530 1727096013.40632: ANSIBALLZ: Using generic lock for ansible.legacy.command 7530 1727096013.40635: ANSIBALLZ: Acquiring lock 7530 1727096013.40638: ANSIBALLZ: Lock acquired: 139837168144544 7530 1727096013.40640: ANSIBALLZ: Creating module 7530 1727096013.49792: ANSIBALLZ: Writing module into payload 7530 1727096013.49860: ANSIBALLZ: Writing module 7530 1727096013.49879: ANSIBALLZ: Renaming module 7530 1727096013.49885: ANSIBALLZ: Done creating module 7530 1727096013.49900: variable 'ansible_facts' from source: unknown 7530 1727096013.49948: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py 7530 1727096013.50056: Sending initial data 7530 1727096013.50060: Sent initial data (154 bytes) 7530 1727096013.50685: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.50710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.50788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.53119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096013.53158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096013.53191: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpqul_6idl /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py <<< 7530 1727096013.53199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py" <<< 7530 1727096013.53225: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpqul_6idl" to remote "/root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py" <<< 7530 1727096013.53235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py" <<< 7530 1727096013.53748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096013.53796: stderr chunk (state=3): >>><<< 7530 1727096013.53800: stdout chunk (state=3): >>><<< 7530 1727096013.53826: done transferring module to remote 7530 1727096013.53835: _low_level_execute_command(): starting 7530 1727096013.53841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/ /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py && sleep 0' 7530 1727096013.54428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.54460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.54553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.57374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096013.57378: stdout chunk (state=3): >>><<< 7530 1727096013.57381: stderr chunk (state=3): >>><<< 7530 1727096013.57481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096013.57485: _low_level_execute_command(): starting 7530 1727096013.57488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/AnsiballZ_command.py && sleep 0' 7530 1727096013.58092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096013.58169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.58227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096013.58250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.58280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.58369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.78745: stdout chunk (state=3): >>> <<< 7530 
1727096013.78749: stdout chunk (state=3): >>>{"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:33.780684", "end": "2024-09-23 08:53:33.785185", "delta": "0:00:00.004501", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 7530 1727096013.78845: stdout chunk (state=3): >>> <<< 7530 1727096013.81526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096013.81532: stderr chunk (state=3): >>><<< 7530 1727096013.81535: stdout chunk (state=3): >>><<< 7530 1727096013.81556: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:33.780684", "end": "2024-09-23 08:53:33.785185", "delta": "0:00:00.004501", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096013.81674: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096013.81678: _low_level_execute_command(): starting 7530 1727096013.81680: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096013.3779192-7723-269006671528831/ > /dev/null 2>&1 && sleep 0' 7530 1727096013.82402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096013.82421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096013.82487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096013.82584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096013.82611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096013.82700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096013.84861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096013.84950: stderr chunk (state=3): >>><<< 7530 1727096013.84954: stdout chunk (state=3): >>><<< 7530 1727096013.85179: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7530 1727096013.85183: handler run complete
7530 1727096013.85185: Evaluated conditional (False): False
7530 1727096013.85188: attempt loop complete, returning result
7530 1727096013.85190: _execute() done
7530 1727096013.85192: dumping result to json
7530 1727096013.85194: done dumping result, returning
7530 1727096013.85195: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-086b-f4f0-000000000389]
7530 1727096013.85197: sending task result for task 0afff68d-5257-086b-f4f0-000000000389
7530 1727096013.85276: done sending task result for task 0afff68d-5257-086b-f4f0-000000000389
7530 1727096013.85281: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.004501",
    "end": "2024-09-23 08:53:33.785185",
    "rc": 0,
    "start": "2024-09-23 08:53:33.780684"
}

STDOUT:

eth0
lo

7530 1727096013.85359: no more pending results, returning what we have
7530 1727096013.85363: results queue empty
7530 1727096013.85364: checking for any_errors_fatal
7530 1727096013.85365: done checking for any_errors_fatal
7530 1727096013.85366: checking for max_fail_percentage
7530 1727096013.85369: done checking for max_fail_percentage
7530 1727096013.85370: checking to see if all hosts have failed and the running result is not ok
7530 1727096013.85371: done checking to see if all hosts have failed
7530 1727096013.85372: getting the remaining hosts for this loop
7530 1727096013.85373: done getting the remaining hosts for this loop
7530 1727096013.85377: getting the next task for host managed_node3
7530 1727096013.85384: done getting next task for host managed_node3
7530 1727096013.85386: ^ task is: TASK: Set current_interfaces
7530 1727096013.85391: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096013.85397: getting variables
7530 1727096013.85399: in VariableManager get_vars()
7530 1727096013.85450: Calling all_inventory to load vars for managed_node3
7530 1727096013.85454: Calling groups_inventory to load vars for managed_node3
7530 1727096013.85457: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096013.85480: Calling all_plugins_play to load vars for managed_node3
7530 1727096013.85484: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096013.85487: Calling groups_plugins_play to load vars for managed_node3
7530 1727096013.85705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096013.85901: done with get_vars()
7530 1727096013.85922: done getting variables
7530 1727096013.85998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Monday 23 September 2024 08:53:33 -0400 (0:00:00.528) 0:00:04.648 ******
7530 1727096013.86042: entering _queue_task() for managed_node3/set_fact
7530 1727096013.86333: worker is 1 (out of 1 available)
7530 1727096013.86462: exiting _queue_task() for managed_node3/set_fact
7530 1727096013.86476: done queuing things up, now waiting for results queue to drain
7530 1727096013.86478: waiting for pending results...
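The "Gather current interface info" task whose result appears above amounts to listing the entries of /sys/class/net (`ls -1` with `chdir: /sys/class/net`). A minimal stand-alone sketch of the same check, not the role's actual code; the function name and the injectable `sysfs` parameter are illustrative:

```python
import os

def current_interfaces(sysfs="/sys/class/net"):
    # Each entry under /sys/class/net is one kernel-registered network
    # interface (e.g. eth0, lo); sorting gives the same order as `ls -1`.
    return sorted(os.listdir(sysfs))
```

On the managed node in this log, this would return `['eth0', 'lo']`, matching the STDOUT shown above.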
7530 1727096013.86792: running TaskExecutor() for managed_node3/TASK: Set current_interfaces
7530 1727096013.86800: in run() - task 0afff68d-5257-086b-f4f0-00000000038a
7530 1727096013.86803: variable 'ansible_search_path' from source: unknown
7530 1727096013.86811: variable 'ansible_search_path' from source: unknown
7530 1727096013.86853: calling self._execute()
7530 1727096013.86957: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096013.86970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096013.86985: variable 'omit' from source: magic vars
7530 1727096013.87345: variable 'ansible_distribution_major_version' from source: facts
7530 1727096013.87357: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096013.87363: variable 'omit' from source: magic vars
7530 1727096013.87395: variable 'omit' from source: magic vars
7530 1727096013.87472: variable '_current_interfaces' from source: set_fact
7530 1727096013.87523: variable 'omit' from source: magic vars
7530 1727096013.87556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096013.87586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096013.87603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096013.87616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096013.87626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096013.87650: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096013.87655: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096013.87658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096013.87728: Set connection var ansible_pipelining to False
7530 1727096013.87733: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096013.87739: Set connection var ansible_timeout to 10
7530 1727096013.87746: Set connection var ansible_shell_executable to /bin/sh
7530 1727096013.87749: Set connection var ansible_shell_type to sh
7530 1727096013.87752: Set connection var ansible_connection to ssh
7530 1727096013.87777: variable 'ansible_shell_executable' from source: unknown
7530 1727096013.87781: variable 'ansible_connection' from source: unknown
7530 1727096013.87784: variable 'ansible_module_compression' from source: unknown
7530 1727096013.87786: variable 'ansible_shell_type' from source: unknown
7530 1727096013.87788: variable 'ansible_shell_executable' from source: unknown
7530 1727096013.87790: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096013.87792: variable 'ansible_pipelining' from source: unknown
7530 1727096013.87795: variable 'ansible_timeout' from source: unknown
7530 1727096013.87797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096013.87898: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096013.87908: variable 'omit' from source: magic vars
7530 1727096013.87911: starting attempt loop
7530 1727096013.87914: running the handler
7530 1727096013.87927: handler run complete
7530 1727096013.87934: attempt loop complete, returning result
7530 1727096013.87937: _execute() done
7530 1727096013.87939: dumping result to json
7530 1727096013.87942: done dumping result, returning
7530 1727096013.87948: done running TaskExecutor() for managed_node3/TASK:
Set current_interfaces [0afff68d-5257-086b-f4f0-00000000038a]
7530 1727096013.87952: sending task result for task 0afff68d-5257-086b-f4f0-00000000038a
7530 1727096013.88037: done sending task result for task 0afff68d-5257-086b-f4f0-00000000038a
7530 1727096013.88039: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "eth0",
            "lo"
        ]
    },
    "changed": false
}

7530 1727096013.88103: no more pending results, returning what we have
7530 1727096013.88105: results queue empty
7530 1727096013.88106: checking for any_errors_fatal
7530 1727096013.88117: done checking for any_errors_fatal
7530 1727096013.88120: checking for max_fail_percentage
7530 1727096013.88122: done checking for max_fail_percentage
7530 1727096013.88123: checking to see if all hosts have failed and the running result is not ok
7530 1727096013.88124: done checking to see if all hosts have failed
7530 1727096013.88125: getting the remaining hosts for this loop
7530 1727096013.88126: done getting the remaining hosts for this loop
7530 1727096013.88129: getting the next task for host managed_node3
7530 1727096013.88136: done getting next task for host managed_node3
7530 1727096013.88138: ^ task is: TASK: Show current_interfaces
7530 1727096013.88141: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096013.88145: getting variables
7530 1727096013.88146: in VariableManager get_vars()
7530 1727096013.88198: Calling all_inventory to load vars for managed_node3
7530 1727096013.88201: Calling groups_inventory to load vars for managed_node3
7530 1727096013.88203: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096013.88212: Calling all_plugins_play to load vars for managed_node3
7530 1727096013.88214: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096013.88217: Calling groups_plugins_play to load vars for managed_node3
7530 1727096013.88342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096013.88461: done with get_vars()
7530 1727096013.88471: done getting variables
7530 1727096013.88541: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Monday 23 September 2024 08:53:33 -0400 (0:00:00.025) 0:00:04.673 ******
7530 1727096013.88563: entering _queue_task() for managed_node3/debug
7530 1727096013.88564: Creating lock for debug
7530 1727096013.88786: worker is 1 (out of 1 available)
7530 1727096013.88798: exiting _queue_task() for managed_node3/debug
7530 1727096013.88810: done queuing things up, now waiting for results queue to drain
7530 1727096013.88812: waiting for pending results...
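The set_fact result above shows the registered command output ("eth0\nlo") being stored as the list fact current_interfaces. Ansible derives a registered result's stdout_lines by splitting stdout on newlines; a minimal sketch of that transformation under those assumptions (the result dict and variable names here are illustrative, not the actual contents of get_current_interfaces.yml):

```python
# A stand-in for the registered `_current_interfaces` command result
# seen earlier in this log (only the fields we need).
result = {"stdout": "eth0\nlo", "rc": 0}

def stdout_lines(result):
    # Mirrors how stdout_lines is derived from a registered result:
    # split the raw stdout on newline boundaries.
    return result["stdout"].splitlines()

current_interfaces = stdout_lines(result)
```

With the stdout from this run, `current_interfaces` becomes `['eth0', 'lo']`, which is exactly the fact the ok-result above reports.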
7530 1727096013.88973: running TaskExecutor() for managed_node3/TASK: Show current_interfaces
7530 1727096013.89033: in run() - task 0afff68d-5257-086b-f4f0-0000000001cb
7530 1727096013.89054: variable 'ansible_search_path' from source: unknown
7530 1727096013.89057: variable 'ansible_search_path' from source: unknown
7530 1727096013.89082: calling self._execute()
7530 1727096013.89390: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096013.89394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096013.89397: variable 'omit' from source: magic vars
7530 1727096013.89703: variable 'ansible_distribution_major_version' from source: facts
7530 1727096013.89722: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096013.89736: variable 'omit' from source: magic vars
7530 1727096013.89797: variable 'omit' from source: magic vars
7530 1727096013.89909: variable 'current_interfaces' from source: set_fact
7530 1727096013.89952: variable 'omit' from source: magic vars
7530 1727096013.89997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096013.90049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096013.90076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096013.90097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096013.90112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096013.90159: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096013.90169: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096013.90178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096013.90296: Set connection var ansible_pipelining to False
7530 1727096013.90307: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096013.90316: Set connection var ansible_timeout to 10
7530 1727096013.90329: Set connection var ansible_shell_executable to /bin/sh
7530 1727096013.90336: Set connection var ansible_shell_type to sh
7530 1727096013.90342: Set connection var ansible_connection to ssh
7530 1727096013.90388: variable 'ansible_shell_executable' from source: unknown
7530 1727096013.90396: variable 'ansible_connection' from source: unknown
7530 1727096013.90404: variable 'ansible_module_compression' from source: unknown
7530 1727096013.90439: variable 'ansible_shell_type' from source: unknown
7530 1727096013.90443: variable 'ansible_shell_executable' from source: unknown
7530 1727096013.90445: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096013.90449: variable 'ansible_pipelining' from source: unknown
7530 1727096013.90451: variable 'ansible_timeout' from source: unknown
7530 1727096013.90452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096013.90561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096013.90580: variable 'omit' from source: magic vars
7530 1727096013.90587: starting attempt loop
7530 1727096013.90590: running the handler
7530 1727096013.90640: handler run complete
7530 1727096013.90651: attempt loop complete, returning result
7530 1727096013.90654: _execute() done
7530 1727096013.90656: dumping result to json
7530 1727096013.90659: done dumping result, returning
7530 1727096013.90666: done running TaskExecutor() for managed_node3/TASK: Show
current_interfaces [0afff68d-5257-086b-f4f0-0000000001cb] 7530 1727096013.90674: sending task result for task 0afff68d-5257-086b-f4f0-0000000001cb 7530 1727096013.90758: done sending task result for task 0afff68d-5257-086b-f4f0-0000000001cb 7530 1727096013.90761: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7530 1727096013.90873: no more pending results, returning what we have 7530 1727096013.90876: results queue empty 7530 1727096013.90877: checking for any_errors_fatal 7530 1727096013.90879: done checking for any_errors_fatal 7530 1727096013.90880: checking for max_fail_percentage 7530 1727096013.90881: done checking for max_fail_percentage 7530 1727096013.90882: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.90883: done checking to see if all hosts have failed 7530 1727096013.90884: getting the remaining hosts for this loop 7530 1727096013.90885: done getting the remaining hosts for this loop 7530 1727096013.90888: getting the next task for host managed_node3 7530 1727096013.90894: done getting next task for host managed_node3 7530 1727096013.90896: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7530 1727096013.90898: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096013.90901: getting variables 7530 1727096013.90902: in VariableManager get_vars() 7530 1727096013.90935: Calling all_inventory to load vars for managed_node3 7530 1727096013.90937: Calling groups_inventory to load vars for managed_node3 7530 1727096013.90939: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.90948: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.90950: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.90953: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.91064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.91187: done with get_vars() 7530 1727096013.91197: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:11 Monday 23 September 2024 08:53:33 -0400 (0:00:00.026) 0:00:04.700 ****** 7530 1727096013.91260: entering _queue_task() for managed_node3/include_tasks 7530 1727096013.91475: worker is 1 (out of 1 available) 7530 1727096013.91488: exiting _queue_task() for managed_node3/include_tasks 7530 1727096013.91501: done queuing things up, now waiting for results queue to drain 7530 1727096013.91502: waiting for pending results... 
7530 1727096013.91658: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7530 1727096013.91725: in run() - task 0afff68d-5257-086b-f4f0-00000000000c 7530 1727096013.91739: variable 'ansible_search_path' from source: unknown 7530 1727096013.91769: calling self._execute() 7530 1727096013.91832: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.91848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.91852: variable 'omit' from source: magic vars 7530 1727096013.92126: variable 'ansible_distribution_major_version' from source: facts 7530 1727096013.92175: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096013.92178: _execute() done 7530 1727096013.92181: dumping result to json 7530 1727096013.92189: done dumping result, returning 7530 1727096013.92196: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-086b-f4f0-00000000000c] 7530 1727096013.92198: sending task result for task 0afff68d-5257-086b-f4f0-00000000000c 7530 1727096013.92340: done sending task result for task 0afff68d-5257-086b-f4f0-00000000000c 7530 1727096013.92342: WORKER PROCESS EXITING 7530 1727096013.92374: no more pending results, returning what we have 7530 1727096013.92379: in VariableManager get_vars() 7530 1727096013.92434: Calling all_inventory to load vars for managed_node3 7530 1727096013.92437: Calling groups_inventory to load vars for managed_node3 7530 1727096013.92439: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.92450: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.92452: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.92455: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.92633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096013.92860: done with get_vars() 7530 1727096013.92871: variable 'ansible_search_path' from source: unknown 7530 1727096013.92888: we have included files to process 7530 1727096013.92889: generating all_blocks data 7530 1727096013.92891: done generating all_blocks data 7530 1727096013.92899: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096013.92901: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096013.92904: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096013.93393: in VariableManager get_vars() 7530 1727096013.93424: done with get_vars() 7530 1727096013.93624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 7530 1727096013.94190: done processing included file 7530 1727096013.94193: iterating over new_blocks loaded from include file 7530 1727096013.94195: in VariableManager get_vars() 7530 1727096013.94221: done with get_vars() 7530 1727096013.94223: filtering new block on tags 7530 1727096013.94255: done filtering new block on tags 7530 1727096013.94258: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7530 1727096013.94264: extending task lists for all hosts with included blocks 7530 1727096013.96738: done extending task lists 7530 1727096013.96740: done processing included files 7530 1727096013.96741: results queue empty 7530 1727096013.96741: checking for any_errors_fatal 7530 1727096013.96745: done checking for any_errors_fatal 7530 1727096013.96746: checking for max_fail_percentage 7530 1727096013.96748: done checking for 
max_fail_percentage 7530 1727096013.96748: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.96749: done checking to see if all hosts have failed 7530 1727096013.96750: getting the remaining hosts for this loop 7530 1727096013.96752: done getting the remaining hosts for this loop 7530 1727096013.96755: getting the next task for host managed_node3 7530 1727096013.96759: done getting next task for host managed_node3 7530 1727096013.96762: ^ task is: TASK: Ensure state in ["present", "absent"] 7530 1727096013.96764: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096013.96766: getting variables 7530 1727096013.96769: in VariableManager get_vars() 7530 1727096013.96792: Calling all_inventory to load vars for managed_node3 7530 1727096013.96795: Calling groups_inventory to load vars for managed_node3 7530 1727096013.96797: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.96803: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.96805: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.96808: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.96953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.97158: done with get_vars() 7530 1727096013.97171: done getting variables 7530 1727096013.97242: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 08:53:33 -0400 (0:00:00.060) 0:00:04.761 ****** 7530 1727096013.97272: entering _queue_task() for managed_node3/fail 7530 1727096013.97274: Creating lock for fail 7530 1727096013.97581: worker is 1 (out of 1 available) 7530 1727096013.97593: exiting _queue_task() for managed_node3/fail 7530 1727096013.97606: done queuing things up, now waiting for results queue to drain 7530 1727096013.97608: waiting for pending results... 
7530 1727096013.97769: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7530 1727096013.97833: in run() - task 0afff68d-5257-086b-f4f0-0000000003a5 7530 1727096013.97847: variable 'ansible_search_path' from source: unknown 7530 1727096013.97850: variable 'ansible_search_path' from source: unknown 7530 1727096013.97883: calling self._execute() 7530 1727096013.97947: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.97950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.97960: variable 'omit' from source: magic vars 7530 1727096013.98234: variable 'ansible_distribution_major_version' from source: facts 7530 1727096013.98245: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096013.98340: variable 'state' from source: include params 7530 1727096013.98344: Evaluated conditional (state not in ["present", "absent"]): False 7530 1727096013.98347: when evaluation is False, skipping this task 7530 1727096013.98352: _execute() done 7530 1727096013.98354: dumping result to json 7530 1727096013.98357: done dumping result, returning 7530 1727096013.98363: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-086b-f4f0-0000000003a5] 7530 1727096013.98370: sending task result for task 0afff68d-5257-086b-f4f0-0000000003a5 7530 1727096013.98461: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003a5 7530 1727096013.98463: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7530 1727096013.98543: no more pending results, returning what we have 7530 1727096013.98546: results queue empty 7530 1727096013.98547: checking for any_errors_fatal 7530 1727096013.98549: done checking for any_errors_fatal 7530 1727096013.98549: checking for 
max_fail_percentage 7530 1727096013.98551: done checking for max_fail_percentage 7530 1727096013.98551: checking to see if all hosts have failed and the running result is not ok 7530 1727096013.98552: done checking to see if all hosts have failed 7530 1727096013.98553: getting the remaining hosts for this loop 7530 1727096013.98554: done getting the remaining hosts for this loop 7530 1727096013.98557: getting the next task for host managed_node3 7530 1727096013.98562: done getting next task for host managed_node3 7530 1727096013.98564: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096013.98570: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096013.98573: getting variables 7530 1727096013.98574: in VariableManager get_vars() 7530 1727096013.98616: Calling all_inventory to load vars for managed_node3 7530 1727096013.98621: Calling groups_inventory to load vars for managed_node3 7530 1727096013.98623: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096013.98634: Calling all_plugins_play to load vars for managed_node3 7530 1727096013.98636: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096013.98638: Calling groups_plugins_play to load vars for managed_node3 7530 1727096013.98759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096013.98895: done with get_vars() 7530 1727096013.98903: done getting variables 7530 1727096013.98945: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 08:53:33 -0400 (0:00:00.016) 0:00:04.778 ****** 7530 1727096013.98969: entering _queue_task() for managed_node3/fail 7530 1727096013.99181: worker is 1 (out of 1 available) 7530 1727096013.99195: exiting _queue_task() for managed_node3/fail 7530 1727096013.99208: done queuing things up, now waiting for results queue to drain 7530 1727096013.99210: waiting for pending results... 
7530 1727096013.99374: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096013.99447: in run() - task 0afff68d-5257-086b-f4f0-0000000003a6 7530 1727096013.99458: variable 'ansible_search_path' from source: unknown 7530 1727096013.99462: variable 'ansible_search_path' from source: unknown 7530 1727096013.99491: calling self._execute() 7530 1727096013.99559: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096013.99563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096013.99573: variable 'omit' from source: magic vars 7530 1727096013.99848: variable 'ansible_distribution_major_version' from source: facts 7530 1727096013.99858: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096013.99961: variable 'type' from source: play vars 7530 1727096013.99965: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7530 1727096013.99970: when evaluation is False, skipping this task 7530 1727096013.99973: _execute() done 7530 1727096013.99977: dumping result to json 7530 1727096013.99980: done dumping result, returning 7530 1727096013.99994: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-086b-f4f0-0000000003a6] 7530 1727096013.99996: sending task result for task 0afff68d-5257-086b-f4f0-0000000003a6 7530 1727096014.00077: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003a6 7530 1727096014.00080: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7530 1727096014.00142: no more pending results, returning what we have 7530 1727096014.00145: results queue empty 7530 1727096014.00146: checking for any_errors_fatal 7530 1727096014.00152: done checking for any_errors_fatal 7530 1727096014.00153: checking for 
max_fail_percentage 7530 1727096014.00154: done checking for max_fail_percentage 7530 1727096014.00155: checking to see if all hosts have failed and the running result is not ok 7530 1727096014.00156: done checking to see if all hosts have failed 7530 1727096014.00157: getting the remaining hosts for this loop 7530 1727096014.00158: done getting the remaining hosts for this loop 7530 1727096014.00162: getting the next task for host managed_node3 7530 1727096014.00170: done getting next task for host managed_node3 7530 1727096014.00173: ^ task is: TASK: Include the task 'show_interfaces.yml' 7530 1727096014.00176: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096014.00179: getting variables 7530 1727096014.00180: in VariableManager get_vars() 7530 1727096014.00225: Calling all_inventory to load vars for managed_node3 7530 1727096014.00228: Calling groups_inventory to load vars for managed_node3 7530 1727096014.00230: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.00240: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.00243: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.00245: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.00410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.00534: done with get_vars() 7530 1727096014.00541: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 08:53:34 -0400 (0:00:00.016) 0:00:04.794 ****** 7530 1727096014.00612: entering _queue_task() for managed_node3/include_tasks 7530 1727096014.00837: worker is 1 (out of 1 available) 7530 1727096014.00850: exiting _queue_task() for managed_node3/include_tasks 7530 1727096014.00861: done queuing things up, now waiting for results queue to drain 7530 1727096014.00863: waiting for pending results... 
7530 1727096014.01025: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7530 1727096014.01103: in run() - task 0afff68d-5257-086b-f4f0-0000000003a7 7530 1727096014.01115: variable 'ansible_search_path' from source: unknown 7530 1727096014.01119: variable 'ansible_search_path' from source: unknown 7530 1727096014.01149: calling self._execute() 7530 1727096014.01218: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.01224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.01233: variable 'omit' from source: magic vars 7530 1727096014.01511: variable 'ansible_distribution_major_version' from source: facts 7530 1727096014.01528: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096014.01532: _execute() done 7530 1727096014.01534: dumping result to json 7530 1727096014.01537: done dumping result, returning 7530 1727096014.01547: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-086b-f4f0-0000000003a7] 7530 1727096014.01549: sending task result for task 0afff68d-5257-086b-f4f0-0000000003a7 7530 1727096014.01637: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003a7 7530 1727096014.01639: WORKER PROCESS EXITING 7530 1727096014.01673: no more pending results, returning what we have 7530 1727096014.01677: in VariableManager get_vars() 7530 1727096014.01736: Calling all_inventory to load vars for managed_node3 7530 1727096014.01739: Calling groups_inventory to load vars for managed_node3 7530 1727096014.01741: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.01755: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.01758: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.01761: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.01911: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.02034: done with get_vars() 7530 1727096014.02040: variable 'ansible_search_path' from source: unknown 7530 1727096014.02041: variable 'ansible_search_path' from source: unknown 7530 1727096014.02066: we have included files to process 7530 1727096014.02069: generating all_blocks data 7530 1727096014.02071: done generating all_blocks data 7530 1727096014.02075: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096014.02076: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096014.02078: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096014.02150: in VariableManager get_vars() 7530 1727096014.02172: done with get_vars() 7530 1727096014.02254: done processing included file 7530 1727096014.02256: iterating over new_blocks loaded from include file 7530 1727096014.02257: in VariableManager get_vars() 7530 1727096014.02274: done with get_vars() 7530 1727096014.02275: filtering new block on tags 7530 1727096014.02289: done filtering new block on tags 7530 1727096014.02291: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7530 1727096014.02295: extending task lists for all hosts with included blocks 7530 1727096014.02560: done extending task lists 7530 1727096014.02561: done processing included files 7530 1727096014.02562: results queue empty 7530 1727096014.02562: checking for any_errors_fatal 7530 1727096014.02564: done checking for any_errors_fatal 7530 1727096014.02565: checking for max_fail_percentage 7530 
1727096014.02566: done checking for max_fail_percentage 7530 1727096014.02566: checking to see if all hosts have failed and the running result is not ok 7530 1727096014.02567: done checking to see if all hosts have failed 7530 1727096014.02569: getting the remaining hosts for this loop 7530 1727096014.02569: done getting the remaining hosts for this loop 7530 1727096014.02571: getting the next task for host managed_node3 7530 1727096014.02574: done getting next task for host managed_node3 7530 1727096014.02575: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7530 1727096014.02578: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096014.02580: getting variables 7530 1727096014.02580: in VariableManager get_vars() 7530 1727096014.02592: Calling all_inventory to load vars for managed_node3 7530 1727096014.02594: Calling groups_inventory to load vars for managed_node3 7530 1727096014.02595: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.02600: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.02601: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.02603: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.02692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.02805: done with get_vars() 7530 1727096014.02812: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:53:34 -0400 (0:00:00.022) 0:00:04.817 ****** 7530 1727096014.02873: entering _queue_task() for managed_node3/include_tasks 7530 1727096014.03111: worker is 1 (out of 1 available) 7530 1727096014.03129: exiting _queue_task() for managed_node3/include_tasks 7530 1727096014.03139: done queuing things up, now waiting for results queue to drain 7530 1727096014.03141: waiting for pending results... 
7530 1727096014.03306: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7530 1727096014.03385: in run() - task 0afff68d-5257-086b-f4f0-00000000057e 7530 1727096014.03394: variable 'ansible_search_path' from source: unknown 7530 1727096014.03398: variable 'ansible_search_path' from source: unknown 7530 1727096014.03427: calling self._execute() 7530 1727096014.03492: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.03501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.03507: variable 'omit' from source: magic vars 7530 1727096014.03788: variable 'ansible_distribution_major_version' from source: facts 7530 1727096014.03800: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096014.03804: _execute() done 7530 1727096014.03807: dumping result to json 7530 1727096014.03811: done dumping result, returning 7530 1727096014.03825: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-086b-f4f0-00000000057e] 7530 1727096014.03828: sending task result for task 0afff68d-5257-086b-f4f0-00000000057e 7530 1727096014.03916: done sending task result for task 0afff68d-5257-086b-f4f0-00000000057e 7530 1727096014.03922: WORKER PROCESS EXITING 7530 1727096014.03950: no more pending results, returning what we have 7530 1727096014.03954: in VariableManager get_vars() 7530 1727096014.04013: Calling all_inventory to load vars for managed_node3 7530 1727096014.04016: Calling groups_inventory to load vars for managed_node3 7530 1727096014.04020: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.04033: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.04036: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.04039: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.04206: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.04326: done with get_vars() 7530 1727096014.04332: variable 'ansible_search_path' from source: unknown 7530 1727096014.04333: variable 'ansible_search_path' from source: unknown 7530 1727096014.04376: we have included files to process 7530 1727096014.04377: generating all_blocks data 7530 1727096014.04379: done generating all_blocks data 7530 1727096014.04379: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096014.04380: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096014.04382: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096014.04558: done processing included file 7530 1727096014.04560: iterating over new_blocks loaded from include file 7530 1727096014.04562: in VariableManager get_vars() 7530 1727096014.04583: done with get_vars() 7530 1727096014.04584: filtering new block on tags 7530 1727096014.04597: done filtering new block on tags 7530 1727096014.04598: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7530 1727096014.04602: extending task lists for all hosts with included blocks 7530 1727096014.04693: done extending task lists 7530 1727096014.04694: done processing included files 7530 1727096014.04695: results queue empty 7530 1727096014.04695: checking for any_errors_fatal 7530 1727096014.04697: done checking for any_errors_fatal 7530 1727096014.04698: checking for max_fail_percentage 7530 1727096014.04698: done checking for max_fail_percentage 7530 
1727096014.04699: checking to see if all hosts have failed and the running result is not ok 7530 1727096014.04700: done checking to see if all hosts have failed 7530 1727096014.04700: getting the remaining hosts for this loop 7530 1727096014.04701: done getting the remaining hosts for this loop 7530 1727096014.04702: getting the next task for host managed_node3 7530 1727096014.04705: done getting next task for host managed_node3 7530 1727096014.04707: ^ task is: TASK: Gather current interface info 7530 1727096014.04709: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096014.04710: getting variables 7530 1727096014.04711: in VariableManager get_vars() 7530 1727096014.04724: Calling all_inventory to load vars for managed_node3 7530 1727096014.04725: Calling groups_inventory to load vars for managed_node3 7530 1727096014.04727: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.04731: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.04732: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.04734: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.04824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.04940: done with get_vars() 7530 1727096014.04947: done getting variables 7530 1727096014.04979: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Monday 23 September 2024 08:53:34 -0400 (0:00:00.021) 0:00:04.838 ******

7530 1727096014.05005: entering _queue_task() for managed_node3/command 7530 1727096014.05241: worker is 1 (out of 1 available) 7530 1727096014.05255: exiting _queue_task() for managed_node3/command 7530 1727096014.05269: done queuing things up, now waiting for results queue to drain 7530 1727096014.05271: waiting for pending results... 
7530 1727096014.05442: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7530 1727096014.05529: in run() - task 0afff68d-5257-086b-f4f0-0000000005b5 7530 1727096014.05541: variable 'ansible_search_path' from source: unknown 7530 1727096014.05545: variable 'ansible_search_path' from source: unknown 7530 1727096014.05575: calling self._execute() 7530 1727096014.05642: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.05646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.05655: variable 'omit' from source: magic vars 7530 1727096014.06235: variable 'ansible_distribution_major_version' from source: facts 7530 1727096014.06243: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096014.06249: variable 'omit' from source: magic vars 7530 1727096014.06291: variable 'omit' from source: magic vars 7530 1727096014.06316: variable 'omit' from source: magic vars 7530 1727096014.06352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096014.06384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096014.06400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096014.06413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.06425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.06449: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096014.06451: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.06456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.06531: Set connection 
var ansible_pipelining to False 7530 1727096014.06535: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096014.06542: Set connection var ansible_timeout to 10 7530 1727096014.06549: Set connection var ansible_shell_executable to /bin/sh 7530 1727096014.06552: Set connection var ansible_shell_type to sh 7530 1727096014.06554: Set connection var ansible_connection to ssh 7530 1727096014.06575: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.06579: variable 'ansible_connection' from source: unknown 7530 1727096014.06582: variable 'ansible_module_compression' from source: unknown 7530 1727096014.06586: variable 'ansible_shell_type' from source: unknown 7530 1727096014.06589: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.06592: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.06594: variable 'ansible_pipelining' from source: unknown 7530 1727096014.06596: variable 'ansible_timeout' from source: unknown 7530 1727096014.06598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.06703: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096014.06723: variable 'omit' from source: magic vars 7530 1727096014.06726: starting attempt loop 7530 1727096014.06729: running the handler 7530 1727096014.06739: _low_level_execute_command(): starting 7530 1727096014.06746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096014.07280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.07285: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.07289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.07338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.07341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.07402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.09858: stdout chunk (state=3): >>>/root <<< 7530 1727096014.10014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.10041: stderr chunk (state=3): >>><<< 7530 1727096014.10045: stdout chunk (state=3): >>><<< 7530 1727096014.10066: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.10080: _low_level_execute_command(): starting 7530 1727096014.10086: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402 `" && echo ansible-tmp-1727096014.1006668-7759-197367340012402="` echo /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402 `" ) && sleep 0' 7530 1727096014.10573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.10577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.10580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
7530 1727096014.10590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.10624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.10628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.10637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.10693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.12974: stdout chunk (state=3): >>>ansible-tmp-1727096014.1006668-7759-197367340012402=/root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402 <<< 7530 1727096014.13077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.13108: stderr chunk (state=3): >>><<< 7530 1727096014.13111: stdout chunk (state=3): >>><<< 7530 1727096014.13129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096014.1006668-7759-197367340012402=/root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.13159: variable 'ansible_module_compression' from source: unknown 7530 1727096014.13201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096014.13233: variable 'ansible_facts' from source: unknown 7530 1727096014.13295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py 7530 1727096014.13402: Sending initial data 7530 1727096014.13406: Sent initial data (154 bytes) 7530 1727096014.13870: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.13874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096014.13876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096014.13878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.13880: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.13935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.13938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.13981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.15633: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096014.15660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096014.15693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpsgeutgre /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py <<< 7530 1727096014.15701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py" <<< 7530 1727096014.15726: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpsgeutgre" to remote "/root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py" <<< 7530 1727096014.15733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py" <<< 7530 1727096014.16278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.16306: stderr chunk (state=3): >>><<< 7530 1727096014.16310: stdout chunk (state=3): >>><<< 7530 1727096014.16349: done transferring module to remote 7530 1727096014.16359: _low_level_execute_command(): starting 7530 1727096014.16363: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/ /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py && sleep 0' 7530 1727096014.16840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.16844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.16847: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096014.16849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.16851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.16907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.16910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.16913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.16954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.19331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.19362: stderr chunk (state=3): >>><<< 7530 1727096014.19366: stdout chunk (state=3): >>><<< 7530 1727096014.19382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.19385: _low_level_execute_command(): starting 7530 1727096014.19390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/AnsiballZ_command.py && sleep 0' 7530 1727096014.19872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.19876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.19879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096014.19881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.19932: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.19936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.19946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.20004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.42903: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:34.421711", "end": "2024-09-23 08:53:34.426743", "delta": "0:00:00.005032", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096014.45429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096014.45457: stderr chunk (state=3): >>><<< 7530 1727096014.45461: stdout chunk (state=3): >>><<< 7530 1727096014.45478: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:34.421711", "end": "2024-09-23 08:53:34.426743", "delta": "0:00:00.005032", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096014.45509: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096014.45516: _low_level_execute_command(): starting 7530 1727096014.45523: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096014.1006668-7759-197367340012402/ > /dev/null 2>&1 && sleep 0' 7530 1727096014.45987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096014.45990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.45995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096014.46004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.46059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.46066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.46071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.46107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.48409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.48438: stderr chunk (state=3): >>><<< 7530 1727096014.48441: stdout chunk (state=3): >>><<< 7530 1727096014.48455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.48461: handler run complete 7530 1727096014.48483: Evaluated conditional (False): False 7530 1727096014.48491: attempt loop complete, returning result 7530 1727096014.48494: _execute() done 7530 1727096014.48496: dumping result to json 7530 1727096014.48501: done dumping result, returning 7530 1727096014.48508: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-086b-f4f0-0000000005b5] 7530 1727096014.48512: sending task result for task 0afff68d-5257-086b-f4f0-0000000005b5 7530 1727096014.48610: done sending task result for task 0afff68d-5257-086b-f4f0-0000000005b5 7530 1727096014.48612: WORKER PROCESS EXITING

ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.005032",
    "end": "2024-09-23 08:53:34.426743",
    "rc": 0,
    "start": "2024-09-23 08:53:34.421711"
}

STDOUT:

eth0
lo

7530 1727096014.48955: no more pending results, returning what we have 7530 1727096014.48958: results queue empty 7530 1727096014.48958: checking for any_errors_fatal 7530 1727096014.48959: done checking for any_errors_fatal 7530 1727096014.48959: checking for max_fail_percentage 7530 1727096014.48960: done checking for max_fail_percentage 7530 1727096014.48961: checking to see if all hosts have failed and the running result is not ok 7530 1727096014.48962: done checking to see if all hosts have failed 7530 1727096014.48962: getting the remaining hosts for this loop 7530 1727096014.48963: done getting the remaining hosts for this loop 7530 1727096014.48965: getting the next task for host managed_node3 7530 1727096014.48971: done getting next task for host managed_node3 7530 1727096014.48972: ^ task is: TASK: Set current_interfaces 7530 1727096014.48977: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096014.48979: getting variables 7530 1727096014.48979: in VariableManager get_vars() 7530 1727096014.49005: Calling all_inventory to load vars for managed_node3 7530 1727096014.49006: Calling groups_inventory to load vars for managed_node3 7530 1727096014.49008: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.49015: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.49017: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.49020: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.49115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.49233: done with get_vars() 7530 1727096014.49240: done getting variables 7530 1727096014.49286: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:53:34 -0400 (0:00:00.443) 0:00:05.281 ****** 7530 1727096014.49307: entering _queue_task() for managed_node3/set_fact 7530 1727096014.49516: worker is 1 (out of 1 available) 7530 1727096014.49531: exiting _queue_task() for managed_node3/set_fact 7530 1727096014.49543: done queuing things up, now waiting for results queue to drain 7530 1727096014.49545: waiting for pending results... 
7530 1727096014.49697: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7530 1727096014.49783: in run() - task 0afff68d-5257-086b-f4f0-0000000005b6 7530 1727096014.49796: variable 'ansible_search_path' from source: unknown 7530 1727096014.49799: variable 'ansible_search_path' from source: unknown 7530 1727096014.49828: calling self._execute() 7530 1727096014.49893: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.49898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.49907: variable 'omit' from source: magic vars 7530 1727096014.50183: variable 'ansible_distribution_major_version' from source: facts 7530 1727096014.50194: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096014.50200: variable 'omit' from source: magic vars 7530 1727096014.50237: variable 'omit' from source: magic vars 7530 1727096014.50311: variable '_current_interfaces' from source: set_fact 7530 1727096014.50358: variable 'omit' from source: magic vars 7530 1727096014.50392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096014.50421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096014.50438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096014.50454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.50463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.50488: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096014.50491: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.50494: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7530 1727096014.50570: Set connection var ansible_pipelining to False 7530 1727096014.50578: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096014.50584: Set connection var ansible_timeout to 10 7530 1727096014.50592: Set connection var ansible_shell_executable to /bin/sh 7530 1727096014.50596: Set connection var ansible_shell_type to sh 7530 1727096014.50598: Set connection var ansible_connection to ssh 7530 1727096014.50617: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.50620: variable 'ansible_connection' from source: unknown 7530 1727096014.50624: variable 'ansible_module_compression' from source: unknown 7530 1727096014.50627: variable 'ansible_shell_type' from source: unknown 7530 1727096014.50630: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.50632: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.50636: variable 'ansible_pipelining' from source: unknown 7530 1727096014.50638: variable 'ansible_timeout' from source: unknown 7530 1727096014.50644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.50746: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096014.50755: variable 'omit' from source: magic vars 7530 1727096014.50760: starting attempt loop 7530 1727096014.50762: running the handler 7530 1727096014.50777: handler run complete 7530 1727096014.50784: attempt loop complete, returning result 7530 1727096014.50786: _execute() done 7530 1727096014.50789: dumping result to json 7530 1727096014.50791: done dumping result, returning 7530 1727096014.50798: done running TaskExecutor() for managed_node3/TASK: 
Set current_interfaces [0afff68d-5257-086b-f4f0-0000000005b6] 7530 1727096014.50801: sending task result for task 0afff68d-5257-086b-f4f0-0000000005b6 7530 1727096014.50880: done sending task result for task 0afff68d-5257-086b-f4f0-0000000005b6 7530 1727096014.50883: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7530 1727096014.50943: no more pending results, returning what we have 7530 1727096014.50946: results queue empty 7530 1727096014.50946: checking for any_errors_fatal 7530 1727096014.50954: done checking for any_errors_fatal 7530 1727096014.50954: checking for max_fail_percentage 7530 1727096014.50956: done checking for max_fail_percentage 7530 1727096014.50956: checking to see if all hosts have failed and the running result is not ok 7530 1727096014.50957: done checking to see if all hosts have failed 7530 1727096014.50959: getting the remaining hosts for this loop 7530 1727096014.50960: done getting the remaining hosts for this loop 7530 1727096014.50964: getting the next task for host managed_node3 7530 1727096014.50974: done getting next task for host managed_node3 7530 1727096014.50977: ^ task is: TASK: Show current_interfaces 7530 1727096014.50982: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096014.50985: getting variables 7530 1727096014.50986: in VariableManager get_vars() 7530 1727096014.51032: Calling all_inventory to load vars for managed_node3 7530 1727096014.51035: Calling groups_inventory to load vars for managed_node3 7530 1727096014.51037: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.51046: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.51048: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.51051: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.51187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.51339: done with get_vars() 7530 1727096014.51355: done getting variables 7530 1727096014.51410: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:53:34 -0400 (0:00:00.021) 0:00:05.302 ****** 7530 1727096014.51435: entering _queue_task() for managed_node3/debug 7530 1727096014.51650: worker is 1 (out of 1 available) 7530 1727096014.51662: exiting _queue_task() for managed_node3/debug 7530 1727096014.51676: done queuing things up, now waiting for results queue to drain 7530 1727096014.51678: waiting for pending results... 
7530 1727096014.51841: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7530 1727096014.51920: in run() - task 0afff68d-5257-086b-f4f0-00000000057f 7530 1727096014.51935: variable 'ansible_search_path' from source: unknown 7530 1727096014.51938: variable 'ansible_search_path' from source: unknown 7530 1727096014.51967: calling self._execute() 7530 1727096014.52044: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.52048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.52056: variable 'omit' from source: magic vars 7530 1727096014.52343: variable 'ansible_distribution_major_version' from source: facts 7530 1727096014.52361: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096014.52369: variable 'omit' from source: magic vars 7530 1727096014.52399: variable 'omit' from source: magic vars 7530 1727096014.52480: variable 'current_interfaces' from source: set_fact 7530 1727096014.52502: variable 'omit' from source: magic vars 7530 1727096014.52537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096014.52566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096014.52589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096014.52604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.52614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.52639: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096014.52642: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.52645: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7530 1727096014.52723: Set connection var ansible_pipelining to False 7530 1727096014.52727: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096014.52730: Set connection var ansible_timeout to 10 7530 1727096014.52738: Set connection var ansible_shell_executable to /bin/sh 7530 1727096014.52741: Set connection var ansible_shell_type to sh 7530 1727096014.52744: Set connection var ansible_connection to ssh 7530 1727096014.52763: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.52766: variable 'ansible_connection' from source: unknown 7530 1727096014.52771: variable 'ansible_module_compression' from source: unknown 7530 1727096014.52773: variable 'ansible_shell_type' from source: unknown 7530 1727096014.52778: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.52781: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.52783: variable 'ansible_pipelining' from source: unknown 7530 1727096014.52785: variable 'ansible_timeout' from source: unknown 7530 1727096014.52787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.52973: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096014.52977: variable 'omit' from source: magic vars 7530 1727096014.52979: starting attempt loop 7530 1727096014.52981: running the handler 7530 1727096014.53005: handler run complete 7530 1727096014.53173: attempt loop complete, returning result 7530 1727096014.53177: _execute() done 7530 1727096014.53179: dumping result to json 7530 1727096014.53181: done dumping result, returning 7530 1727096014.53183: done running TaskExecutor() for managed_node3/TASK: Show 
current_interfaces [0afff68d-5257-086b-f4f0-00000000057f] 7530 1727096014.53185: sending task result for task 0afff68d-5257-086b-f4f0-00000000057f 7530 1727096014.53256: done sending task result for task 0afff68d-5257-086b-f4f0-00000000057f 7530 1727096014.53259: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7530 1727096014.53373: no more pending results, returning what we have 7530 1727096014.53376: results queue empty 7530 1727096014.53377: checking for any_errors_fatal 7530 1727096014.53381: done checking for any_errors_fatal 7530 1727096014.53382: checking for max_fail_percentage 7530 1727096014.53383: done checking for max_fail_percentage 7530 1727096014.53385: checking to see if all hosts have failed and the running result is not ok 7530 1727096014.53386: done checking to see if all hosts have failed 7530 1727096014.53386: getting the remaining hosts for this loop 7530 1727096014.53387: done getting the remaining hosts for this loop 7530 1727096014.53391: getting the next task for host managed_node3 7530 1727096014.53399: done getting next task for host managed_node3 7530 1727096014.53402: ^ task is: TASK: Install iproute 7530 1727096014.53404: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096014.53408: getting variables 7530 1727096014.53438: in VariableManager get_vars() 7530 1727096014.53487: Calling all_inventory to load vars for managed_node3 7530 1727096014.53490: Calling groups_inventory to load vars for managed_node3 7530 1727096014.53492: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096014.53503: Calling all_plugins_play to load vars for managed_node3 7530 1727096014.53506: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096014.53509: Calling groups_plugins_play to load vars for managed_node3 7530 1727096014.53685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096014.53896: done with get_vars() 7530 1727096014.53909: done getting variables 7530 1727096014.53970: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 08:53:34 -0400 (0:00:00.025) 0:00:05.328 ****** 7530 1727096014.54002: entering _queue_task() for managed_node3/package 7530 1727096014.54304: worker is 1 (out of 1 available) 7530 1727096014.54317: exiting _queue_task() for managed_node3/package 7530 1727096014.54331: done queuing things up, now waiting for results queue to drain 7530 1727096014.54333: waiting for pending results... 
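The trace up to this point covers three helper tasks from `get_current_interfaces.yml` and `show_interfaces.yml`: a command task whose result shows `"cmd": ["ls", "-1"]` with stdout `eth0` / `lo`, a `set_fact` task that reads `_current_interfaces` and sets `current_interfaces: ["eth0", "lo"]`, and a `debug` task that prints it. A minimal sketch of what those tasks plausibly look like follows — only the command, the fact names, and the debug message are taken from the log; the `chdir` to `/sys/class/net` and the intermediate `stdout_lines` wiring are assumptions, not confirmed by the trace:

```yaml
# Hypothetical reconstruction of the tasks traced above.
# Assumption: the interface names come from listing /sys/class/net;
# the log only confirms `ls -1` producing "eth0" and "lo".
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interface_info

# The log shows '_current_interfaces' sourced from a set_fact, so the
# intermediate fact is likely derived from the registered stdout lines.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interface_info.stdout_lines }}"

# Matches the traced debug output: current_interfaces: ['eth0', 'lo']
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

With `-vvvv`, each of these tasks produces exactly the sequence seen above: `_queue_task()`, conditional evaluation against `ansible_distribution_major_version`, connection/shell plugin loading, handler execution, and the JSON task result.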
7530 1727096014.54525: running TaskExecutor() for managed_node3/TASK: Install iproute 7530 1727096014.54599: in run() - task 0afff68d-5257-086b-f4f0-0000000003a8 7530 1727096014.54611: variable 'ansible_search_path' from source: unknown 7530 1727096014.54615: variable 'ansible_search_path' from source: unknown 7530 1727096014.54650: calling self._execute() 7530 1727096014.54722: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.54726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.54735: variable 'omit' from source: magic vars 7530 1727096014.55017: variable 'ansible_distribution_major_version' from source: facts 7530 1727096014.55029: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096014.55034: variable 'omit' from source: magic vars 7530 1727096014.55062: variable 'omit' from source: magic vars 7530 1727096014.55203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096014.57175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096014.57179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096014.57181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096014.57193: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096014.57225: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096014.57337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096014.57385: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096014.57403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096014.57430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096014.57441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096014.57525: variable '__network_is_ostree' from source: set_fact 7530 1727096014.57528: variable 'omit' from source: magic vars 7530 1727096014.57554: variable 'omit' from source: magic vars 7530 1727096014.57584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096014.57602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096014.57616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096014.57630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.57638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096014.57662: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096014.57665: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.57670: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 7530 1727096014.57740: Set connection var ansible_pipelining to False 7530 1727096014.57745: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096014.57750: Set connection var ansible_timeout to 10 7530 1727096014.57758: Set connection var ansible_shell_executable to /bin/sh 7530 1727096014.57761: Set connection var ansible_shell_type to sh 7530 1727096014.57763: Set connection var ansible_connection to ssh 7530 1727096014.57788: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.57791: variable 'ansible_connection' from source: unknown 7530 1727096014.57796: variable 'ansible_module_compression' from source: unknown 7530 1727096014.57799: variable 'ansible_shell_type' from source: unknown 7530 1727096014.57801: variable 'ansible_shell_executable' from source: unknown 7530 1727096014.57803: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096014.57805: variable 'ansible_pipelining' from source: unknown 7530 1727096014.57807: variable 'ansible_timeout' from source: unknown 7530 1727096014.57809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096014.57881: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096014.57894: variable 'omit' from source: magic vars 7530 1727096014.57897: starting attempt loop 7530 1727096014.57900: running the handler 7530 1727096014.57902: variable 'ansible_facts' from source: unknown 7530 1727096014.57905: variable 'ansible_facts' from source: unknown 7530 1727096014.57936: _low_level_execute_command(): starting 7530 1727096014.57942: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096014.58471: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.58476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.58480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.58532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.58536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.58538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.58598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.60806: stdout chunk (state=3): >>>/root <<< 7530 1727096014.60901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.60940: stderr chunk (state=3): >>><<< 7530 1727096014.60942: stdout chunk (state=3): >>><<< 7530 1727096014.60956: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.60997: _low_level_execute_command(): starting 7530 1727096014.61001: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518 `" && echo ansible-tmp-1727096014.6096988-7778-217808086528518="` echo /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518 `" ) && sleep 0' 7530 1727096014.61472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.61476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.61479: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.61481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.61537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.61540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.61548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.61585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.63620: stdout chunk (state=3): >>>ansible-tmp-1727096014.6096988-7778-217808086528518=/root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518 <<< 7530 1727096014.63729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.63762: stderr chunk (state=3): >>><<< 7530 1727096014.63765: stdout chunk (state=3): >>><<< 7530 1727096014.63783: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096014.6096988-7778-217808086528518=/root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.63812: variable 'ansible_module_compression' from source: unknown 7530 1727096014.63871: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 7530 1727096014.63875: ANSIBALLZ: Acquiring lock 7530 1727096014.63877: ANSIBALLZ: Lock acquired: 139837168144544 7530 1727096014.63879: ANSIBALLZ: Creating module 7530 1727096014.74860: ANSIBALLZ: Writing module into payload 7530 1727096014.74995: ANSIBALLZ: Writing module 7530 1727096014.75014: ANSIBALLZ: Renaming module 7530 1727096014.75027: ANSIBALLZ: Done creating module 7530 1727096014.75043: variable 'ansible_facts' from source: unknown 7530 1727096014.75105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py 7530 1727096014.75215: Sending initial data 7530 1727096014.75222: Sent initial data (150 bytes) 7530 1727096014.75684: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.75687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.75690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096014.75692: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.75694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.75754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.75758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.75762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.75803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.77446: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096014.77470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096014.77499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpkj31gspb /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py <<< 7530 1727096014.77506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py" <<< 7530 1727096014.77530: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpkj31gspb" to remote "/root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py" <<< 7530 1727096014.77537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py" <<< 7530 1727096014.78165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.78215: stderr chunk (state=3): >>><<< 7530 1727096014.78219: stdout chunk (state=3): >>><<< 7530 1727096014.78251: done transferring module to remote 7530 1727096014.78260: _low_level_execute_command(): starting 7530 1727096014.78264: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/ /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py && sleep 0' 7530 1727096014.78721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096014.78725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.78737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.78795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.78798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.78801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.78843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096014.80711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096014.80740: stderr chunk (state=3): >>><<< 7530 1727096014.80743: stdout chunk (state=3): >>><<< 7530 1727096014.80756: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096014.80759: _low_level_execute_command(): starting 7530 1727096014.80764: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/AnsiballZ_dnf.py && sleep 0' 7530 1727096014.81236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096014.81240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.81254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096014.81309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 7530 1727096014.81312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096014.81315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096014.81365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.06332: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7530 1727096018.11415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096018.11422: stdout chunk (state=3): >>><<< 7530 1727096018.11424: stderr chunk (state=3): >>><<< 7530 1727096018.11574: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096018.11584: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096018.11587: _low_level_execute_command(): starting 7530 1727096018.11589: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096014.6096988-7778-217808086528518/ > /dev/null 2>&1 && sleep 0' 7530 1727096018.12069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.12083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.12094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.12140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.12158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.12198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.14060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.14109: stderr chunk (state=3): >>><<< 7530 1727096018.14112: stdout chunk (state=3): >>><<< 7530 1727096018.14176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.14179: handler run complete 7530 1727096018.14327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096018.14674: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096018.14679: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096018.14681: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096018.14684: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096018.14686: variable '__install_status' from source: unknown 7530 1727096018.14692: Evaluated conditional (__install_status is success): True 7530 1727096018.14699: attempt loop complete, returning result 7530 1727096018.14705: _execute() done 7530 1727096018.14710: dumping result to json 7530 1727096018.14718: done dumping result, returning 7530 1727096018.14730: done running TaskExecutor() for managed_node3/TASK: Install iproute [0afff68d-5257-086b-f4f0-0000000003a8] 7530 1727096018.14737: sending task result for task 0afff68d-5257-086b-f4f0-0000000003a8 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7530 1727096018.15014: no more pending results, returning what we have 7530 1727096018.15017: results queue empty 7530 1727096018.15018: checking for any_errors_fatal 7530 1727096018.15024: done checking for any_errors_fatal 7530 1727096018.15025: checking for max_fail_percentage 7530 1727096018.15026: done checking for max_fail_percentage 7530 1727096018.15027: checking to see if all hosts have failed and the running result is not ok 7530 1727096018.15028: done checking to see if all hosts have failed 
7530 1727096018.15029: getting the remaining hosts for this loop 7530 1727096018.15030: done getting the remaining hosts for this loop 7530 1727096018.15033: getting the next task for host managed_node3 7530 1727096018.15039: done getting next task for host managed_node3 7530 1727096018.15042: ^ task is: TASK: Create veth interface {{ interface }} 7530 1727096018.15044: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096018.15048: getting variables 7530 1727096018.15049: in VariableManager get_vars() 7530 1727096018.15295: Calling all_inventory to load vars for managed_node3 7530 1727096018.15298: Calling groups_inventory to load vars for managed_node3 7530 1727096018.15300: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096018.15311: Calling all_plugins_play to load vars for managed_node3 7530 1727096018.15313: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096018.15361: Calling groups_plugins_play to load vars for managed_node3 7530 1727096018.15553: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003a8 7530 1727096018.15556: WORKER PROCESS EXITING 7530 1727096018.15566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096018.15692: done with get_vars() 7530 1727096018.15701: done getting variables 7530 1727096018.15743: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096018.15838: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 08:53:38 -0400 (0:00:03.618) 0:00:08.947 ****** 7530 1727096018.15874: entering _queue_task() for managed_node3/command 7530 1727096018.16092: worker is 1 (out of 1 available) 7530 1727096018.16106: exiting _queue_task() for managed_node3/command 7530 1727096018.16117: done queuing things up, now waiting for results queue to drain 7530 1727096018.16119: waiting for pending results... 7530 1727096018.16279: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7530 1727096018.16357: in run() - task 0afff68d-5257-086b-f4f0-0000000003a9 7530 1727096018.16370: variable 'ansible_search_path' from source: unknown 7530 1727096018.16375: variable 'ansible_search_path' from source: unknown 7530 1727096018.16594: variable 'interface' from source: play vars 7530 1727096018.16656: variable 'interface' from source: play vars 7530 1727096018.16711: variable 'interface' from source: play vars 7530 1727096018.16828: Loaded config def from plugin (lookup/items) 7530 1727096018.16832: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7530 1727096018.16851: variable 'omit' from source: magic vars 7530 1727096018.16948: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.16956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.16964: variable 'omit' from source: magic 
vars 7530 1727096018.17134: variable 'ansible_distribution_major_version' from source: facts 7530 1727096018.17142: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096018.17300: variable 'type' from source: play vars 7530 1727096018.17304: variable 'state' from source: include params 7530 1727096018.17307: variable 'interface' from source: play vars 7530 1727096018.17309: variable 'current_interfaces' from source: set_fact 7530 1727096018.17312: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7530 1727096018.17314: variable 'omit' from source: magic vars 7530 1727096018.17473: variable 'omit' from source: magic vars 7530 1727096018.17476: variable 'item' from source: unknown 7530 1727096018.17479: variable 'item' from source: unknown 7530 1727096018.17515: variable 'omit' from source: magic vars 7530 1727096018.17557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096018.17597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096018.17626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096018.17650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096018.17669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096018.17707: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096018.17716: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.17726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.17829: Set connection var ansible_pipelining to False 7530 1727096018.17840: Set connection var 
ansible_module_compression to ZIP_DEFLATED 7530 1727096018.17849: Set connection var ansible_timeout to 10 7530 1727096018.17861: Set connection var ansible_shell_executable to /bin/sh 7530 1727096018.17870: Set connection var ansible_shell_type to sh 7530 1727096018.17877: Set connection var ansible_connection to ssh 7530 1727096018.17905: variable 'ansible_shell_executable' from source: unknown 7530 1727096018.17912: variable 'ansible_connection' from source: unknown 7530 1727096018.18072: variable 'ansible_module_compression' from source: unknown 7530 1727096018.18075: variable 'ansible_shell_type' from source: unknown 7530 1727096018.18077: variable 'ansible_shell_executable' from source: unknown 7530 1727096018.18079: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.18081: variable 'ansible_pipelining' from source: unknown 7530 1727096018.18083: variable 'ansible_timeout' from source: unknown 7530 1727096018.18086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.18089: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096018.18115: variable 'omit' from source: magic vars 7530 1727096018.18130: starting attempt loop 7530 1727096018.18137: running the handler 7530 1727096018.18157: _low_level_execute_command(): starting 7530 1727096018.18172: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096018.18763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.18785: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.18828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.18848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.18888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.20597: stdout chunk (state=3): >>>/root <<< 7530 1727096018.20688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.20723: stderr chunk (state=3): >>><<< 7530 1727096018.20727: stdout chunk (state=3): >>><<< 7530 1727096018.20747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.20760: _low_level_execute_command(): starting 7530 1727096018.20766: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267 `" && echo ansible-tmp-1727096018.2074816-7937-4761098669267="` echo /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267 `" ) && sleep 0' 7530 1727096018.21253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.21259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.21262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.21265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096018.21268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.21312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.21315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.21317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.21362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.23360: stdout chunk (state=3): >>>ansible-tmp-1727096018.2074816-7937-4761098669267=/root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267 <<< 7530 1727096018.23459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.23492: stderr chunk (state=3): >>><<< 7530 1727096018.23497: stdout chunk (state=3): >>><<< 7530 1727096018.23521: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096018.2074816-7937-4761098669267=/root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.23547: variable 'ansible_module_compression' from source: unknown 7530 1727096018.23591: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096018.23625: variable 'ansible_facts' from source: unknown 7530 1727096018.23680: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py 7530 1727096018.23790: Sending initial data 7530 1727096018.23794: Sent initial data (152 bytes) 7530 1727096018.24257: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.24263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.24265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.24269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096018.24271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.24323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.24326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.24332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.24366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.26016: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096018.26056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096018.26090: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp918yhpc7 /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py <<< 7530 1727096018.26093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py" <<< 7530 1727096018.26127: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp918yhpc7" to remote "/root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py" <<< 7530 1727096018.26130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py" <<< 7530 1727096018.26638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.26687: stderr chunk (state=3): >>><<< 7530 1727096018.26690: stdout chunk (state=3): >>><<< 7530 1727096018.26735: done transferring module to remote 7530 1727096018.26743: _low_level_execute_command(): starting 7530 1727096018.26748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/ /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py && sleep 0' 7530 1727096018.27210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.27213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096018.27222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096018.27224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096018.27227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.27275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.27283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.27285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.27315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.29151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.29181: stderr chunk (state=3): >>><<< 7530 1727096018.29185: stdout chunk (state=3): >>><<< 7530 1727096018.29201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.29203: _low_level_execute_command(): starting 7530 1727096018.29209: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/AnsiballZ_command.py && sleep 0' 7530 1727096018.29675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.29679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.29681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.29683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096018.29685: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 
1727096018.29738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.29741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.29743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.29792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.52188: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-23 08:53:38.454700", "end": "2024-09-23 08:53:38.518848", "delta": "0:00:00.064148", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096018.54786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096018.54814: stderr chunk (state=3): >>><<< 7530 1727096018.54817: stdout chunk (state=3): >>><<< 7530 1727096018.54837: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-23 08:53:38.454700", "end": "2024-09-23 08:53:38.518848", "delta": "0:00:00.064148", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096018.54866: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096018.54878: _low_level_execute_command(): starting 7530 1727096018.54883: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096018.2074816-7937-4761098669267/ > /dev/null 2>&1 && sleep 0' 7530 1727096018.55344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.55347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.55353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096018.55355: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.55410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.55413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.55420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.55459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.60424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.60453: stderr chunk (state=3): >>><<< 7530 1727096018.60457: stdout chunk (state=3): >>><<< 7530 1727096018.60471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.60479: handler run complete 7530 1727096018.60496: Evaluated conditional (False): False 7530 1727096018.60505: attempt loop complete, returning result 7530 1727096018.60523: variable 'item' from source: unknown 7530 1727096018.60590: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.064148", "end": "2024-09-23 08:53:38.518848", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-23 08:53:38.454700" } 7530 1727096018.60755: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.60758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.60760: variable 'omit' from source: magic vars 7530 1727096018.60880: variable 'ansible_distribution_major_version' from source: facts 7530 1727096018.60883: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096018.61007: variable 'type' from source: play vars 7530 1727096018.61011: variable 'state' from source: include params 7530 1727096018.61013: variable 'interface' from source: play vars 7530 1727096018.61018: variable 'current_interfaces' from source: set_fact 7530 1727096018.61026: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7530 1727096018.61030: variable 'omit' from source: magic vars 7530 1727096018.61042: variable 'omit' from source: magic vars 7530 1727096018.61069: variable 'item' from source: unknown 7530 1727096018.61122: variable 'item' from source: unknown 7530 1727096018.61136: variable 'omit' from source: magic vars 7530 1727096018.61153: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096018.61160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096018.61166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096018.61180: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096018.61183: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.61185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.61242: Set connection var ansible_pipelining to False 7530 1727096018.61245: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096018.61251: Set connection var ansible_timeout to 10 7530 1727096018.61258: Set connection var ansible_shell_executable to /bin/sh 7530 1727096018.61260: Set connection var ansible_shell_type to sh 7530 1727096018.61263: Set connection var ansible_connection to ssh 7530 1727096018.61281: variable 'ansible_shell_executable' from source: unknown 7530 1727096018.61283: variable 'ansible_connection' from source: unknown 7530 1727096018.61286: variable 'ansible_module_compression' from source: unknown 7530 1727096018.61288: variable 'ansible_shell_type' from source: unknown 7530 1727096018.61290: variable 'ansible_shell_executable' from source: unknown 7530 1727096018.61292: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.61296: variable 'ansible_pipelining' from source: unknown 7530 1727096018.61299: variable 'ansible_timeout' from source: unknown 7530 1727096018.61303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.61372: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096018.61382: variable 'omit' from source: magic vars 7530 1727096018.61385: starting attempt loop 7530 1727096018.61387: running the handler 7530 1727096018.61396: _low_level_execute_command(): starting 7530 1727096018.61399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096018.61874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.61878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.61885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.61900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.61943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.61946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.61948: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 7530 1727096018.61998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.63730: stdout chunk (state=3): >>>/root <<< 7530 1727096018.63816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.63851: stderr chunk (state=3): >>><<< 7530 1727096018.63854: stdout chunk (state=3): >>><<< 7530 1727096018.63872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.63881: _low_level_execute_command(): starting 7530 1727096018.63886: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395 `" && echo 
ansible-tmp-1727096018.6387217-7937-64264417255395="` echo /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395 `" ) && sleep 0' 7530 1727096018.64344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.64347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.64349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.64352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.64408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.64412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.64414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.64456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.66441: stdout chunk (state=3): >>>ansible-tmp-1727096018.6387217-7937-64264417255395=/root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395 <<< 7530 1727096018.66541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 
1727096018.66572: stderr chunk (state=3): >>><<< 7530 1727096018.66575: stdout chunk (state=3): >>><<< 7530 1727096018.66591: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096018.6387217-7937-64264417255395=/root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.66613: variable 'ansible_module_compression' from source: unknown 7530 1727096018.66651: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096018.66667: variable 'ansible_facts' from source: unknown 7530 1727096018.66710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py 7530 1727096018.66811: Sending initial data 7530 1727096018.66814: Sent 
initial data (153 bytes) 7530 1727096018.67285: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.67289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.67291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096018.67293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.67350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.67362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.67364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.67397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.69040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 
1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096018.69075: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096018.69101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyrmvac3z /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py <<< 7530 1727096018.69111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py" <<< 7530 1727096018.69138: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 7530 1727096018.69142: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyrmvac3z" to remote "/root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py" <<< 7530 1727096018.69773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.69807: stderr chunk (state=3): >>><<< 7530 1727096018.69882: stdout chunk (state=3): >>><<< 7530 1727096018.69891: done transferring module to remote 7530 1727096018.69905: _low_level_execute_command(): starting 7530 1727096018.69913: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/ /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py && sleep 0' 7530 1727096018.70505: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.70518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096018.70529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.70575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.70598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.70640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.72493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.72525: stderr chunk (state=3): >>><<< 7530 1727096018.72528: stdout chunk (state=3): >>><<< 7530 1727096018.72556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.72560: _low_level_execute_command(): starting 7530 1727096018.72562: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/AnsiballZ_command.py && sleep 0' 7530 1727096018.73205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096018.73223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.73238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.73254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096018.73273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096018.73285: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096018.73298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.73317: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 7530 1727096018.73376: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.73438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.73455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.73530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.73570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.89718: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-23 08:53:38.891222", "end": "2024-09-23 08:53:38.895097", "delta": "0:00:00.003875", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096018.91371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096018.91398: stderr chunk (state=3): >>><<< 7530 1727096018.91402: stdout chunk (state=3): >>><<< 7530 1727096018.91417: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-23 08:53:38.891222", "end": "2024-09-23 08:53:38.895097", "delta": "0:00:00.003875", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096018.91445: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096018.91451: _low_level_execute_command(): starting 7530 1727096018.91455: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096018.6387217-7937-64264417255395/ > /dev/null 2>&1 && sleep 0' 7530 1727096018.91916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.91922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.91925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096018.91927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.91929: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.91980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.91984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.91986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.92026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.93873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.93904: stderr chunk (state=3): >>><<< 7530 1727096018.93908: stdout chunk (state=3): >>><<< 7530 1727096018.93925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.93928: handler run complete 7530 1727096018.93943: Evaluated conditional (False): False 7530 1727096018.93951: attempt loop complete, returning result 7530 1727096018.93965: variable 'item' from source: unknown 7530 1727096018.94033: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003875", "end": "2024-09-23 08:53:38.895097", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-23 08:53:38.891222" } 7530 1727096018.94153: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.94157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.94159: variable 'omit' from source: magic vars 7530 1727096018.94254: variable 'ansible_distribution_major_version' from source: facts 7530 1727096018.94257: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096018.94380: variable 'type' from source: play vars 7530 1727096018.94383: variable 'state' from source: include params 7530 1727096018.94387: variable 'interface' from source: play vars 7530 1727096018.94389: variable 'current_interfaces' from source: set_fact 7530 1727096018.94399: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7530 1727096018.94401: variable 'omit' from source: magic vars 7530 1727096018.94412: variable 'omit' from source: magic vars 7530 1727096018.94440: variable 'item' from source: unknown 7530 1727096018.94485: variable 'item' from source: unknown 7530 1727096018.94496: variable 'omit' from source: magic vars 7530 1727096018.94516: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096018.94527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096018.94530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096018.94540: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096018.94543: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.94546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.94594: Set connection var ansible_pipelining to False 7530 1727096018.94598: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096018.94604: Set connection var ansible_timeout to 10 7530 1727096018.94617: Set connection var ansible_shell_executable to /bin/sh 7530 1727096018.94622: Set connection var ansible_shell_type to sh 7530 1727096018.94625: Set connection var ansible_connection to ssh 7530 1727096018.94635: variable 'ansible_shell_executable' from source: unknown 7530 1727096018.94638: variable 'ansible_connection' from source: unknown 7530 1727096018.94640: variable 'ansible_module_compression' from source: unknown 7530 1727096018.94643: variable 'ansible_shell_type' from source: unknown 7530 1727096018.94645: variable 'ansible_shell_executable' from source: unknown 7530 1727096018.94647: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096018.94651: variable 'ansible_pipelining' from source: unknown 7530 1727096018.94654: variable 'ansible_timeout' from source: unknown 7530 1727096018.94658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096018.94727: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096018.94730: variable 'omit' from source: magic vars 7530 1727096018.94732: starting attempt loop 7530 1727096018.94735: running the handler 7530 1727096018.94740: _low_level_execute_command(): starting 7530 1727096018.94743: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096018.95206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.95209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.95212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096018.95214: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096018.95216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.95274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.95283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.95287: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.95324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.96988: stdout chunk (state=3): >>>/root <<< 7530 1727096018.97079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.97111: stderr chunk (state=3): >>><<< 7530 1727096018.97114: stdout chunk (state=3): >>><<< 7530 1727096018.97133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.97141: _low_level_execute_command(): starting 7530 1727096018.97146: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749 `" && 
echo ansible-tmp-1727096018.971324-7937-58951191565749="` echo /root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749 `" ) && sleep 0' 7530 1727096018.97605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.97609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096018.97611: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.97613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096018.97615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096018.97617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096018.97672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096018.97681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096018.97686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096018.97713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096018.99662: stdout chunk (state=3): >>>ansible-tmp-1727096018.971324-7937-58951191565749=/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749 <<< 7530 
1727096018.99788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096018.99941: stderr chunk (state=3): >>><<< 7530 1727096018.99944: stdout chunk (state=3): >>><<< 7530 1727096018.99947: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096018.971324-7937-58951191565749=/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096018.99955: variable 'ansible_module_compression' from source: unknown 7530 1727096018.99957: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096018.99984: variable 'ansible_facts' from source: unknown 7530 1727096019.00078: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py 7530 1727096019.00250: Sending initial data 7530 1727096019.00277: Sent initial data (152 bytes) 7530 1727096019.00986: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.01008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096019.01085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096019.01173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.01208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096019.01225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.01295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.02893: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096019.02926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096019.02961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpvtdplu4g /root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py <<< 7530 1727096019.02965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py" <<< 7530 1727096019.02989: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpvtdplu4g" to remote "/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py" <<< 7530 1727096019.02999: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py" <<< 7530 1727096019.03491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.03538: stderr chunk (state=3): >>><<< 7530 1727096019.03541: stdout chunk (state=3): >>><<< 7530 1727096019.03574: done transferring module to remote 7530 1727096019.03582: _low_level_execute_command(): starting 7530 1727096019.03591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/ 
/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py && sleep 0' 7530 1727096019.04183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.04234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.04250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096019.04272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.04356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.06167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.06203: stderr chunk (state=3): >>><<< 7530 1727096019.06206: stdout chunk (state=3): >>><<< 7530 1727096019.06223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096019.06226: _low_level_execute_command(): starting 7530 1727096019.06228: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/AnsiballZ_command.py && sleep 0' 7530 1727096019.06679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.06682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.06689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096019.06691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.06693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.06735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096019.06746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.06794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.22581: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-23 08:53:39.219577", "end": "2024-09-23 08:53:39.223550", "delta": "0:00:00.003973", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096019.24247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096019.24275: stderr chunk (state=3): >>><<< 7530 1727096019.24278: stdout chunk (state=3): >>><<< 7530 1727096019.24292: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-23 08:53:39.219577", "end": "2024-09-23 08:53:39.223550", "delta": "0:00:00.003973", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096019.24318: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096019.24321: _low_level_execute_command(): starting 7530 1727096019.24330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096018.971324-7937-58951191565749/ > /dev/null 2>&1 && sleep 0' 7530 1727096019.24785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.24789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.24791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.24793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.24843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096019.24847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.24889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.26721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.26752: stderr chunk (state=3): >>><<< 7530 1727096019.26755: stdout chunk (state=3): >>><<< 7530 1727096019.26771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 7530 1727096019.26775: handler run complete 7530 1727096019.26790: Evaluated conditional (False): False 7530 1727096019.26798: attempt loop complete, returning result 7530 1727096019.26812: variable 'item' from source: unknown 7530 1727096019.26881: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003973", "end": "2024-09-23 08:53:39.223550", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-23 08:53:39.219577" } 7530 1727096019.27006: dumping result to json 7530 1727096019.27008: done dumping result, returning 7530 1727096019.27010: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0afff68d-5257-086b-f4f0-0000000003a9] 7530 1727096019.27012: sending task result for task 0afff68d-5257-086b-f4f0-0000000003a9 7530 1727096019.27058: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003a9 7530 1727096019.27060: WORKER PROCESS EXITING 7530 1727096019.27126: no more pending results, returning what we have 7530 1727096019.27129: results queue empty 7530 1727096019.27130: checking for any_errors_fatal 7530 1727096019.27137: done checking for any_errors_fatal 7530 1727096019.27138: checking for max_fail_percentage 7530 1727096019.27139: done checking for max_fail_percentage 7530 1727096019.27140: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.27141: done checking to see if all hosts have failed 7530 1727096019.27142: getting the remaining hosts for this loop 7530 1727096019.27143: done getting the remaining hosts for this loop 7530 1727096019.27146: getting the next task for host managed_node3 7530 1727096019.27151: done getting next task for host managed_node3 7530 1727096019.27154: ^ task is: TASK: Set up veth as managed by NetworkManager 7530 1727096019.27156: ^ state is: HOST STATE: block=2, task=4, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096019.27159: getting variables 7530 1727096019.27160: in VariableManager get_vars() 7530 1727096019.27211: Calling all_inventory to load vars for managed_node3 7530 1727096019.27214: Calling groups_inventory to load vars for managed_node3 7530 1727096019.27216: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.27228: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.27230: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.27233: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.27386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.27509: done with get_vars() 7530 1727096019.27518: done getting variables 7530 1727096019.27560: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 08:53:39 -0400 (0:00:01.117) 0:00:10.064 ****** 7530 1727096019.27582: entering 
_queue_task() for managed_node3/command 7530 1727096019.27788: worker is 1 (out of 1 available) 7530 1727096019.27801: exiting _queue_task() for managed_node3/command 7530 1727096019.27812: done queuing things up, now waiting for results queue to drain 7530 1727096019.27814: waiting for pending results... 7530 1727096019.27963: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7530 1727096019.28031: in run() - task 0afff68d-5257-086b-f4f0-0000000003aa 7530 1727096019.28044: variable 'ansible_search_path' from source: unknown 7530 1727096019.28047: variable 'ansible_search_path' from source: unknown 7530 1727096019.28077: calling self._execute() 7530 1727096019.28141: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.28151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.28153: variable 'omit' from source: magic vars 7530 1727096019.28417: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.28427: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.28532: variable 'type' from source: play vars 7530 1727096019.28535: variable 'state' from source: include params 7530 1727096019.28541: Evaluated conditional (type == 'veth' and state == 'present'): True 7530 1727096019.28548: variable 'omit' from source: magic vars 7530 1727096019.28573: variable 'omit' from source: magic vars 7530 1727096019.28643: variable 'interface' from source: play vars 7530 1727096019.28658: variable 'omit' from source: magic vars 7530 1727096019.28694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096019.28724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096019.28738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096019.28751: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096019.28760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096019.28784: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096019.28788: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.28790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.28860: Set connection var ansible_pipelining to False 7530 1727096019.28863: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096019.28871: Set connection var ansible_timeout to 10 7530 1727096019.28879: Set connection var ansible_shell_executable to /bin/sh 7530 1727096019.28882: Set connection var ansible_shell_type to sh 7530 1727096019.28884: Set connection var ansible_connection to ssh 7530 1727096019.28902: variable 'ansible_shell_executable' from source: unknown 7530 1727096019.28905: variable 'ansible_connection' from source: unknown 7530 1727096019.28908: variable 'ansible_module_compression' from source: unknown 7530 1727096019.28912: variable 'ansible_shell_type' from source: unknown 7530 1727096019.28914: variable 'ansible_shell_executable' from source: unknown 7530 1727096019.28918: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.28923: variable 'ansible_pipelining' from source: unknown 7530 1727096019.28925: variable 'ansible_timeout' from source: unknown 7530 1727096019.28927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.29026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096019.29032: variable 'omit' from source: magic vars 7530 1727096019.29047: starting attempt loop 7530 1727096019.29050: running the handler 7530 1727096019.29056: _low_level_execute_command(): starting 7530 1727096019.29063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096019.29576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.29580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096019.29585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096019.29588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.29636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.29640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.29686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 
1727096019.31339: stdout chunk (state=3): >>>/root <<< 7530 1727096019.31440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.31472: stderr chunk (state=3): >>><<< 7530 1727096019.31475: stdout chunk (state=3): >>><<< 7530 1727096019.31497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096019.31509: _low_level_execute_command(): starting 7530 1727096019.31516: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142 `" && echo ansible-tmp-1727096019.3149736-7988-47248239975142="` echo /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142 `" ) && sleep 0' 7530 1727096019.31976: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096019.31980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096019.31991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.31994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096019.31996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.32038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.32041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.32085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.34005: stdout chunk (state=3): >>>ansible-tmp-1727096019.3149736-7988-47248239975142=/root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142 <<< 7530 1727096019.34174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.34177: stderr chunk (state=3): >>><<< 7530 1727096019.34180: stdout chunk (state=3): >>><<< 7530 1727096019.34183: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096019.3149736-7988-47248239975142=/root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096019.34242: variable 'ansible_module_compression' from source: unknown 7530 1727096019.34301: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096019.34347: variable 'ansible_facts' from source: unknown 7530 1727096019.34443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py 7530 1727096019.34674: Sending initial data 7530 1727096019.34677: Sent initial data (153 bytes) 7530 1727096019.35139: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096019.35151: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096019.35172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.35176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.35237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.35240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.35284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.36920: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: 
Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096019.36945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096019.36989: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5o5sj1x8 /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py <<< 7530 1727096019.36996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py" <<< 7530 1727096019.37021: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5o5sj1x8" to remote "/root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py" <<< 7530 1727096019.37025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py" <<< 7530 1727096019.37774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.37778: stdout chunk (state=3): >>><<< 7530 1727096019.37780: stderr chunk (state=3): >>><<< 7530 1727096019.37782: done transferring module to remote 7530 1727096019.37784: _low_level_execute_command(): starting 7530 1727096019.37795: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/ /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py && sleep 0' 7530 1727096019.38365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.38380: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.38392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.38450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.38473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.38493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.40374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.40494: stdout chunk (state=3): >>><<< 7530 1727096019.40498: stderr chunk (state=3): >>><<< 7530 1727096019.40501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096019.40503: _low_level_execute_command(): starting 7530 1727096019.40506: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/AnsiballZ_command.py && sleep 0' 7530 1727096019.41099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096019.41112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096019.41163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.41201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096019.41212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.41276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.41290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.41344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.67343: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-23 08:53:39.566290", "end": "2024-09-23 08:53:39.671175", "delta": "0:00:00.104885", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096019.69181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096019.69185: stdout chunk (state=3): >>><<< 7530 1727096019.69187: stderr chunk (state=3): >>><<< 7530 1727096019.69190: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-23 08:53:39.566290", "end": "2024-09-23 08:53:39.671175", "delta": "0:00:00.104885", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096019.69193: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096019.69195: _low_level_execute_command(): starting 7530 1727096019.69197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096019.3149736-7988-47248239975142/ > /dev/null 2>&1 && sleep 0' 7530 1727096019.69836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096019.69864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096019.69952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.69996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.70016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096019.70049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096019.70125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096019.72103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096019.72107: stdout chunk (state=3): >>><<< 7530 1727096019.72110: stderr chunk (state=3): >>><<< 7530 1727096019.72276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 7530 1727096019.72279: handler run complete 7530 1727096019.72282: Evaluated conditional (False): False 7530 1727096019.72284: attempt loop complete, returning result 7530 1727096019.72287: _execute() done 7530 1727096019.72289: dumping result to json 7530 1727096019.72291: done dumping result, returning 7530 1727096019.72293: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-086b-f4f0-0000000003aa] 7530 1727096019.72296: sending task result for task 0afff68d-5257-086b-f4f0-0000000003aa 7530 1727096019.72372: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003aa 7530 1727096019.72374: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.104885", "end": "2024-09-23 08:53:39.671175", "rc": 0, "start": "2024-09-23 08:53:39.566290" } 7530 1727096019.72537: no more pending results, returning what we have 7530 1727096019.72540: results queue empty 7530 1727096019.72541: checking for any_errors_fatal 7530 1727096019.72555: done checking for any_errors_fatal 7530 1727096019.72556: checking for max_fail_percentage 7530 1727096019.72558: done checking for max_fail_percentage 7530 1727096019.72559: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.72560: done checking to see if all hosts have failed 7530 1727096019.72560: getting the remaining hosts for this loop 7530 1727096019.72562: done getting the remaining hosts for this loop 7530 1727096019.72565: getting the next task for host managed_node3 7530 1727096019.72574: done getting next task for host managed_node3 7530 1727096019.72576: ^ task is: TASK: Delete veth interface {{ interface }} 7530 1727096019.72580: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096019.72585: getting variables 7530 1727096019.72586: in VariableManager get_vars() 7530 1727096019.72636: Calling all_inventory to load vars for managed_node3 7530 1727096019.72639: Calling groups_inventory to load vars for managed_node3 7530 1727096019.72641: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.72652: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.72654: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.72656: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.72990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.73194: done with get_vars() 7530 1727096019.73206: done getting variables 7530 1727096019.73262: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096019.73383: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 08:53:39 -0400 (0:00:00.458) 0:00:10.522 ****** 7530 1727096019.73411: entering _queue_task() for managed_node3/command 7530 1727096019.73899: worker is 1 
(out of 1 available) 7530 1727096019.73907: exiting _queue_task() for managed_node3/command 7530 1727096019.73917: done queuing things up, now waiting for results queue to drain 7530 1727096019.73918: waiting for pending results... 7530 1727096019.74092: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7530 1727096019.74098: in run() - task 0afff68d-5257-086b-f4f0-0000000003ab 7530 1727096019.74103: variable 'ansible_search_path' from source: unknown 7530 1727096019.74106: variable 'ansible_search_path' from source: unknown 7530 1727096019.74118: calling self._execute() 7530 1727096019.74216: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.74253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.74257: variable 'omit' from source: magic vars 7530 1727096019.74625: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.74683: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.74849: variable 'type' from source: play vars 7530 1727096019.74859: variable 'state' from source: include params 7530 1727096019.74871: variable 'interface' from source: play vars 7530 1727096019.74881: variable 'current_interfaces' from source: set_fact 7530 1727096019.74893: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7530 1727096019.74906: when evaluation is False, skipping this task 7530 1727096019.74972: _execute() done 7530 1727096019.74975: dumping result to json 7530 1727096019.74978: done dumping result, returning 7530 1727096019.74980: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0afff68d-5257-086b-f4f0-0000000003ab] 7530 1727096019.74982: sending task result for task 0afff68d-5257-086b-f4f0-0000000003ab 7530 1727096019.75273: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003ab 7530 1727096019.75277: WORKER 
PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096019.75317: no more pending results, returning what we have 7530 1727096019.75321: results queue empty 7530 1727096019.75322: checking for any_errors_fatal 7530 1727096019.75330: done checking for any_errors_fatal 7530 1727096019.75331: checking for max_fail_percentage 7530 1727096019.75333: done checking for max_fail_percentage 7530 1727096019.75334: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.75335: done checking to see if all hosts have failed 7530 1727096019.75335: getting the remaining hosts for this loop 7530 1727096019.75337: done getting the remaining hosts for this loop 7530 1727096019.75340: getting the next task for host managed_node3 7530 1727096019.75346: done getting next task for host managed_node3 7530 1727096019.75349: ^ task is: TASK: Create dummy interface {{ interface }} 7530 1727096019.75352: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.75356: getting variables 7530 1727096019.75358: in VariableManager get_vars() 7530 1727096019.75404: Calling all_inventory to load vars for managed_node3 7530 1727096019.75408: Calling groups_inventory to load vars for managed_node3 7530 1727096019.75410: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.75420: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.75423: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.75426: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.75597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.75857: done with get_vars() 7530 1727096019.75870: done getting variables 7530 1727096019.75926: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096019.76040: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 08:53:39 -0400 (0:00:00.026) 0:00:10.549 ****** 7530 1727096019.76074: entering _queue_task() for managed_node3/command 7530 1727096019.76330: worker is 1 (out of 1 available) 7530 1727096019.76343: exiting _queue_task() for managed_node3/command 7530 1727096019.76355: done queuing things up, now waiting for results queue to drain 7530 1727096019.76357: waiting for pending results... 
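
The successful "Set up veth as managed by NetworkManager" result earlier in this run (rc=0, `cmd: nmcli d set veth0 managed true`) implies a command task of roughly the following shape. This is a hedged sketch: the task name and command are taken from the logged result, but any guard the real playbook places on this task is not visible in the log and is omitted here.

```yaml
# Sketch inferred from the logged ok: result above; the task name and
# command are from the log, but this is not the playbook's verbatim source.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
```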
7530 1727096019.76606: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7530 1727096019.76721: in run() - task 0afff68d-5257-086b-f4f0-0000000003ac 7530 1727096019.76741: variable 'ansible_search_path' from source: unknown 7530 1727096019.76748: variable 'ansible_search_path' from source: unknown 7530 1727096019.76788: calling self._execute() 7530 1727096019.76873: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.76885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.76896: variable 'omit' from source: magic vars 7530 1727096019.77239: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.77258: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.77466: variable 'type' from source: play vars 7530 1727096019.77555: variable 'state' from source: include params 7530 1727096019.77558: variable 'interface' from source: play vars 7530 1727096019.77561: variable 'current_interfaces' from source: set_fact 7530 1727096019.77564: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7530 1727096019.77566: when evaluation is False, skipping this task 7530 1727096019.77570: _execute() done 7530 1727096019.77572: dumping result to json 7530 1727096019.77574: done dumping result, returning 7530 1727096019.77576: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0afff68d-5257-086b-f4f0-0000000003ac] 7530 1727096019.77578: sending task result for task 0afff68d-5257-086b-f4f0-0000000003ac 7530 1727096019.77641: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003ac 7530 1727096019.77644: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096019.77697: no more pending results, returning what we have 7530 1727096019.77701: results queue empty 7530 1727096019.77702: checking for any_errors_fatal 7530 1727096019.77710: done checking for any_errors_fatal 7530 1727096019.77711: checking for max_fail_percentage 7530 1727096019.77713: done checking for max_fail_percentage 7530 1727096019.77714: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.77715: done checking to see if all hosts have failed 7530 1727096019.77715: getting the remaining hosts for this loop 7530 1727096019.77717: done getting the remaining hosts for this loop 7530 1727096019.77721: getting the next task for host managed_node3 7530 1727096019.77728: done getting next task for host managed_node3 7530 1727096019.77731: ^ task is: TASK: Delete dummy interface {{ interface }} 7530 1727096019.77735: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.77739: getting variables 7530 1727096019.77741: in VariableManager get_vars() 7530 1727096019.77798: Calling all_inventory to load vars for managed_node3 7530 1727096019.77801: Calling groups_inventory to load vars for managed_node3 7530 1727096019.77803: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.77817: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.77820: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.77824: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.78205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.78434: done with get_vars() 7530 1727096019.78445: done getting variables 7530 1727096019.78503: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096019.78619: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 08:53:39 -0400 (0:00:00.025) 0:00:10.574 ****** 7530 1727096019.78648: entering _queue_task() for managed_node3/command 7530 1727096019.78898: worker is 1 (out of 1 available) 7530 1727096019.78911: exiting _queue_task() for managed_node3/command 7530 1727096019.78923: done queuing things up, now waiting for results queue to drain 7530 1727096019.78925: waiting for pending results... 
7530 1727096019.79282: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7530 1727096019.79287: in run() - task 0afff68d-5257-086b-f4f0-0000000003ad 7530 1727096019.79293: variable 'ansible_search_path' from source: unknown 7530 1727096019.79300: variable 'ansible_search_path' from source: unknown 7530 1727096019.79337: calling self._execute() 7530 1727096019.79432: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.79444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.79458: variable 'omit' from source: magic vars 7530 1727096019.79822: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.79838: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.80042: variable 'type' from source: play vars 7530 1727096019.80052: variable 'state' from source: include params 7530 1727096019.80131: variable 'interface' from source: play vars 7530 1727096019.80135: variable 'current_interfaces' from source: set_fact 7530 1727096019.80138: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7530 1727096019.80140: when evaluation is False, skipping this task 7530 1727096019.80142: _execute() done 7530 1727096019.80144: dumping result to json 7530 1727096019.80146: done dumping result, returning 7530 1727096019.80149: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0afff68d-5257-086b-f4f0-0000000003ad] 7530 1727096019.80151: sending task result for task 0afff68d-5257-086b-f4f0-0000000003ad 7530 1727096019.80217: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003ad 7530 1727096019.80220: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096019.80287: no more pending results, returning what we have 7530 1727096019.80291: results queue empty 7530 1727096019.80292: checking for any_errors_fatal 7530 1727096019.80299: done checking for any_errors_fatal 7530 1727096019.80300: checking for max_fail_percentage 7530 1727096019.80302: done checking for max_fail_percentage 7530 1727096019.80303: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.80304: done checking to see if all hosts have failed 7530 1727096019.80305: getting the remaining hosts for this loop 7530 1727096019.80306: done getting the remaining hosts for this loop 7530 1727096019.80310: getting the next task for host managed_node3 7530 1727096019.80317: done getting next task for host managed_node3 7530 1727096019.80320: ^ task is: TASK: Create tap interface {{ interface }} 7530 1727096019.80323: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.80328: getting variables 7530 1727096019.80330: in VariableManager get_vars() 7530 1727096019.80385: Calling all_inventory to load vars for managed_node3 7530 1727096019.80388: Calling groups_inventory to load vars for managed_node3 7530 1727096019.80391: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.80405: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.80408: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.80411: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.80844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.81057: done with get_vars() 7530 1727096019.81070: done getting variables 7530 1727096019.81126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096019.81233: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 08:53:39 -0400 (0:00:00.026) 0:00:10.600 ****** 7530 1727096019.81262: entering _queue_task() for managed_node3/command 7530 1727096019.81512: worker is 1 (out of 1 available) 7530 1727096019.81523: exiting _queue_task() for managed_node3/command 7530 1727096019.81536: done queuing things up, now waiting for results queue to drain 7530 1727096019.81538: waiting for pending results... 
7530 1727096019.81985: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7530 1727096019.81990: in run() - task 0afff68d-5257-086b-f4f0-0000000003ae 7530 1727096019.81994: variable 'ansible_search_path' from source: unknown 7530 1727096019.81997: variable 'ansible_search_path' from source: unknown 7530 1727096019.82001: calling self._execute() 7530 1727096019.82044: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.82056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.82072: variable 'omit' from source: magic vars 7530 1727096019.82430: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.82451: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.82656: variable 'type' from source: play vars 7530 1727096019.82673: variable 'state' from source: include params 7530 1727096019.82683: variable 'interface' from source: play vars 7530 1727096019.82692: variable 'current_interfaces' from source: set_fact 7530 1727096019.82702: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7530 1727096019.82707: when evaluation is False, skipping this task 7530 1727096019.82712: _execute() done 7530 1727096019.82717: dumping result to json 7530 1727096019.82723: done dumping result, returning 7530 1727096019.82731: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0afff68d-5257-086b-f4f0-0000000003ae] 7530 1727096019.82739: sending task result for task 0afff68d-5257-086b-f4f0-0000000003ae 7530 1727096019.82977: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003ae 7530 1727096019.82980: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096019.83035: no more pending results, returning what we have 7530 1727096019.83039: results queue empty 7530 1727096019.83040: checking for any_errors_fatal 7530 1727096019.83047: done checking for any_errors_fatal 7530 1727096019.83047: checking for max_fail_percentage 7530 1727096019.83049: done checking for max_fail_percentage 7530 1727096019.83050: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.83051: done checking to see if all hosts have failed 7530 1727096019.83051: getting the remaining hosts for this loop 7530 1727096019.83053: done getting the remaining hosts for this loop 7530 1727096019.83057: getting the next task for host managed_node3 7530 1727096019.83063: done getting next task for host managed_node3 7530 1727096019.83066: ^ task is: TASK: Delete tap interface {{ interface }} 7530 1727096019.83071: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.83076: getting variables 7530 1727096019.83077: in VariableManager get_vars() 7530 1727096019.83133: Calling all_inventory to load vars for managed_node3 7530 1727096019.83136: Calling groups_inventory to load vars for managed_node3 7530 1727096019.83138: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.83152: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.83155: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.83157: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.83448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.83645: done with get_vars() 7530 1727096019.83658: done getting variables 7530 1727096019.83721: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096019.83842: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 08:53:39 -0400 (0:00:00.026) 0:00:10.627 ****** 7530 1727096019.83877: entering _queue_task() for managed_node3/command 7530 1727096019.84163: worker is 1 (out of 1 available) 7530 1727096019.84178: exiting _queue_task() for managed_node3/command 7530 1727096019.84191: done queuing things up, now waiting for results queue to drain 7530 1727096019.84193: waiting for pending results... 
7530 1727096019.84589: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7530 1727096019.84595: in run() - task 0afff68d-5257-086b-f4f0-0000000003af 7530 1727096019.84599: variable 'ansible_search_path' from source: unknown 7530 1727096019.84602: variable 'ansible_search_path' from source: unknown 7530 1727096019.84642: calling self._execute() 7530 1727096019.84738: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.84749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.84763: variable 'omit' from source: magic vars 7530 1727096019.85143: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.85162: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.85375: variable 'type' from source: play vars 7530 1727096019.85387: variable 'state' from source: include params 7530 1727096019.85397: variable 'interface' from source: play vars 7530 1727096019.85406: variable 'current_interfaces' from source: set_fact 7530 1727096019.85419: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7530 1727096019.85427: when evaluation is False, skipping this task 7530 1727096019.85434: _execute() done 7530 1727096019.85440: dumping result to json 7530 1727096019.85452: done dumping result, returning 7530 1727096019.85464: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0afff68d-5257-086b-f4f0-0000000003af] 7530 1727096019.85477: sending task result for task 0afff68d-5257-086b-f4f0-0000000003af 7530 1727096019.85625: done sending task result for task 0afff68d-5257-086b-f4f0-0000000003af 7530 1727096019.85628: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096019.85710: no 
more pending results, returning what we have 7530 1727096019.85713: results queue empty 7530 1727096019.85714: checking for any_errors_fatal 7530 1727096019.85720: done checking for any_errors_fatal 7530 1727096019.85721: checking for max_fail_percentage 7530 1727096019.85723: done checking for max_fail_percentage 7530 1727096019.85723: checking to see if all hosts have failed and the running result is not ok 7530 1727096019.85724: done checking to see if all hosts have failed 7530 1727096019.85725: getting the remaining hosts for this loop 7530 1727096019.85727: done getting the remaining hosts for this loop 7530 1727096019.85731: getting the next task for host managed_node3 7530 1727096019.85739: done getting next task for host managed_node3 7530 1727096019.85742: ^ task is: TASK: Include the task 'assert_device_present.yml' 7530 1727096019.85745: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.85749: getting variables 7530 1727096019.85751: in VariableManager get_vars() 7530 1727096019.85804: Calling all_inventory to load vars for managed_node3 7530 1727096019.85807: Calling groups_inventory to load vars for managed_node3 7530 1727096019.85809: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.85822: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.85825: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.85828: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.86233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.86414: done with get_vars() 7530 1727096019.86426: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:15 Monday 23 September 2024 08:53:39 -0400 (0:00:00.026) 0:00:10.653 ****** 7530 1727096019.86515: entering _queue_task() for managed_node3/include_tasks 7530 1727096019.86779: worker is 1 (out of 1 available) 7530 1727096019.86790: exiting _queue_task() for managed_node3/include_tasks 7530 1727096019.86801: done queuing things up, now waiting for results queue to drain 7530 1727096019.86803: waiting for pending results... 
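
The five "skipping" results traced above (delete veth, create/delete dummy, create/delete tap) correspond to guarded command tasks along these lines. In this sketch the task names and `when` expressions are verbatim from the log's task names and `false_condition` fields, while the command bodies are assumed placeholders, not the test playbook's actual source. For this run (type `veth`, state `present`, and `veth0` already in `current_interfaces`) every guard evaluates False, which is why all five tasks are skipped.

```yaml
# Hedged reconstruction: `when` expressions match the logged false_condition
# strings exactly; the `command` lines are assumptions for illustration.
- name: Delete veth interface {{ interface }}
  command: ip link delete {{ interface }} type veth
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  command: ip link delete {{ interface }} type dummy
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces

- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```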
7530 1727096019.87188: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7530 1727096019.87193: in run() - task 0afff68d-5257-086b-f4f0-00000000000d 7530 1727096019.87197: variable 'ansible_search_path' from source: unknown 7530 1727096019.87219: calling self._execute() 7530 1727096019.87311: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.87324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.87339: variable 'omit' from source: magic vars 7530 1727096019.87716: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.87738: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.87832: _execute() done 7530 1727096019.87836: dumping result to json 7530 1727096019.87838: done dumping result, returning 7530 1727096019.87841: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-086b-f4f0-00000000000d] 7530 1727096019.87843: sending task result for task 0afff68d-5257-086b-f4f0-00000000000d 7530 1727096019.87916: done sending task result for task 0afff68d-5257-086b-f4f0-00000000000d 7530 1727096019.87919: WORKER PROCESS EXITING 7530 1727096019.87964: no more pending results, returning what we have 7530 1727096019.87971: in VariableManager get_vars() 7530 1727096019.88032: Calling all_inventory to load vars for managed_node3 7530 1727096019.88035: Calling groups_inventory to load vars for managed_node3 7530 1727096019.88037: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.88052: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.88055: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.88059: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.88359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096019.88662: done with get_vars() 7530 1727096019.88672: variable 'ansible_search_path' from source: unknown 7530 1727096019.88686: we have included files to process 7530 1727096019.88687: generating all_blocks data 7530 1727096019.88689: done generating all_blocks data 7530 1727096019.88693: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096019.88694: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096019.88696: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096019.88852: in VariableManager get_vars() 7530 1727096019.88881: done with get_vars() 7530 1727096019.88986: done processing included file 7530 1727096019.88988: iterating over new_blocks loaded from include file 7530 1727096019.88990: in VariableManager get_vars() 7530 1727096019.89013: done with get_vars() 7530 1727096019.89015: filtering new block on tags 7530 1727096019.89031: done filtering new block on tags 7530 1727096019.89034: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7530 1727096019.89039: extending task lists for all hosts with included blocks 7530 1727096019.93182: done extending task lists 7530 1727096019.93185: done processing included files 7530 1727096019.93186: results queue empty 7530 1727096019.93187: checking for any_errors_fatal 7530 1727096019.93190: done checking for any_errors_fatal 7530 1727096019.93191: checking for max_fail_percentage 7530 1727096019.93193: done checking for max_fail_percentage 7530 1727096019.93193: checking to see if all hosts have failed and the running 
result is not ok 7530 1727096019.93194: done checking to see if all hosts have failed 7530 1727096019.93195: getting the remaining hosts for this loop 7530 1727096019.93197: done getting the remaining hosts for this loop 7530 1727096019.93199: getting the next task for host managed_node3 7530 1727096019.93203: done getting next task for host managed_node3 7530 1727096019.93206: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7530 1727096019.93209: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.93211: getting variables 7530 1727096019.93212: in VariableManager get_vars() 7530 1727096019.93237: Calling all_inventory to load vars for managed_node3 7530 1727096019.93240: Calling groups_inventory to load vars for managed_node3 7530 1727096019.93242: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.93249: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.93251: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.93254: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.93408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.93593: done with get_vars() 7530 1727096019.93605: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Monday 23 September 2024 08:53:39 -0400 (0:00:00.071) 0:00:10.725 ******

7530 1727096019.93676: entering _queue_task() for managed_node3/include_tasks 7530 1727096019.93956: worker is 1 (out of 1 available) 7530 1727096019.94171: exiting _queue_task() for managed_node3/include_tasks 7530 1727096019.94182: done queuing things up, now waiting for results queue to drain 7530 1727096019.94183: waiting for pending results... 
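[Editor's note] For orientation, the include chain traced above pulls in two small task files. Reconstructed from the task names, task paths, and the stat module arguments that appear later in this log, they plausibly read as follows; the `interface_stat` register name and the assert step's exact wording are assumptions, not shown in the log:

```yaml
# assert_device_present.yml (sketch reconstructed from this log)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

# get_interface_stat.yml (sketch; module args copied from the stat invocation below)
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # assumed variable name
```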
7530 1727096019.94314: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7530 1727096019.94349: in run() - task 0afff68d-5257-086b-f4f0-0000000005f5 7530 1727096019.94373: variable 'ansible_search_path' from source: unknown 7530 1727096019.94381: variable 'ansible_search_path' from source: unknown 7530 1727096019.94427: calling self._execute() 7530 1727096019.94520: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.94533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.94548: variable 'omit' from source: magic vars 7530 1727096019.95061: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.95064: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.95069: _execute() done 7530 1727096019.95072: dumping result to json 7530 1727096019.95074: done dumping result, returning 7530 1727096019.95077: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-086b-f4f0-0000000005f5] 7530 1727096019.95079: sending task result for task 0afff68d-5257-086b-f4f0-0000000005f5 7530 1727096019.95149: done sending task result for task 0afff68d-5257-086b-f4f0-0000000005f5 7530 1727096019.95152: WORKER PROCESS EXITING 7530 1727096019.95195: no more pending results, returning what we have 7530 1727096019.95201: in VariableManager get_vars() 7530 1727096019.95258: Calling all_inventory to load vars for managed_node3 7530 1727096019.95262: Calling groups_inventory to load vars for managed_node3 7530 1727096019.95265: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.95280: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.95284: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.95287: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.95693: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.95888: done with get_vars() 7530 1727096019.95896: variable 'ansible_search_path' from source: unknown 7530 1727096019.95897: variable 'ansible_search_path' from source: unknown 7530 1727096019.95933: we have included files to process 7530 1727096019.95934: generating all_blocks data 7530 1727096019.95936: done generating all_blocks data 7530 1727096019.95937: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096019.95938: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096019.95941: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096019.96166: done processing included file 7530 1727096019.96170: iterating over new_blocks loaded from include file 7530 1727096019.96172: in VariableManager get_vars() 7530 1727096019.96197: done with get_vars() 7530 1727096019.96199: filtering new block on tags 7530 1727096019.96214: done filtering new block on tags 7530 1727096019.96217: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7530 1727096019.96222: extending task lists for all hosts with included blocks 7530 1727096019.96322: done extending task lists 7530 1727096019.96323: done processing included files 7530 1727096019.96324: results queue empty 7530 1727096019.96325: checking for any_errors_fatal 7530 1727096019.96328: done checking for any_errors_fatal 7530 1727096019.96330: checking for max_fail_percentage 7530 1727096019.96331: done checking for max_fail_percentage 7530 1727096019.96331: 
checking to see if all hosts have failed and the running result is not ok 7530 1727096019.96332: done checking to see if all hosts have failed 7530 1727096019.96333: getting the remaining hosts for this loop 7530 1727096019.96334: done getting the remaining hosts for this loop 7530 1727096019.96337: getting the next task for host managed_node3 7530 1727096019.96341: done getting next task for host managed_node3 7530 1727096019.96343: ^ task is: TASK: Get stat for interface {{ interface }} 7530 1727096019.96346: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096019.96349: getting variables 7530 1727096019.96350: in VariableManager get_vars() 7530 1727096019.96369: Calling all_inventory to load vars for managed_node3 7530 1727096019.96371: Calling groups_inventory to load vars for managed_node3 7530 1727096019.96373: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096019.96379: Calling all_plugins_play to load vars for managed_node3 7530 1727096019.96381: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096019.96384: Calling groups_plugins_play to load vars for managed_node3 7530 1727096019.96521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096019.96728: done with get_vars() 7530 1727096019.96740: done getting variables 7530 1727096019.96887: variable 'interface' from source: play vars

TASK [Get stat for interface veth0] ********************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Monday 23 September 2024 08:53:39 -0400 (0:00:00.032) 0:00:10.757 ******

7530 1727096019.96916: entering _queue_task() for managed_node3/stat 7530 1727096019.97188: worker is 1 (out of 1 available) 7530 1727096019.97200: exiting _queue_task() for managed_node3/stat 7530 1727096019.97212: done queuing things up, now waiting for results queue to drain 7530 1727096019.97214: waiting for pending results... 
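[Editor's note] The stat module that runs next returns a single JSON object on stdout (shown in full further down, fused into the SSH chunk trace). As a minimal sketch of how that result yields the "device present" verdict, here is a parse of an abbreviated copy of that output; every field below is copied from the log, nothing else is assumed:

```python
import json

# Abbreviated stat-module stdout, copied from the result further down in this log.
raw = """
{"changed": false,
 "stat": {"exists": true, "path": "/sys/class/net/veth0", "islnk": true,
          "lnk_source": "/sys/devices/virtual/net/veth0"},
 "invocation": {"module_args": {"path": "/sys/class/net/veth0", "follow": false}}}
"""

result = json.loads(raw)

# "Device present" means the sysfs entry exists; veth0 is a virtual device,
# so /sys/class/net/veth0 is a symlink into /sys/devices/virtual/net.
device_present = result["stat"]["exists"]
print(device_present)  # prints: True
```

This mirrors what the `assert_device_present.yml` task checks on the controller side once the module result comes back over SSH.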
7530 1727096019.97474: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7530 1727096019.97592: in run() - task 0afff68d-5257-086b-f4f0-0000000007ee 7530 1727096019.97609: variable 'ansible_search_path' from source: unknown 7530 1727096019.97675: variable 'ansible_search_path' from source: unknown 7530 1727096019.97680: calling self._execute() 7530 1727096019.97749: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.97762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.97783: variable 'omit' from source: magic vars 7530 1727096019.98132: variable 'ansible_distribution_major_version' from source: facts 7530 1727096019.98149: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096019.98160: variable 'omit' from source: magic vars 7530 1727096019.98207: variable 'omit' from source: magic vars 7530 1727096019.98308: variable 'interface' from source: play vars 7530 1727096019.98337: variable 'omit' from source: magic vars 7530 1727096019.98472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096019.98476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096019.98479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096019.98481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096019.98483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096019.98508: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096019.98517: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.98525: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7530 1727096019.98633: Set connection var ansible_pipelining to False 7530 1727096019.98645: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096019.98656: Set connection var ansible_timeout to 10 7530 1727096019.98672: Set connection var ansible_shell_executable to /bin/sh 7530 1727096019.98680: Set connection var ansible_shell_type to sh 7530 1727096019.98686: Set connection var ansible_connection to ssh 7530 1727096019.98720: variable 'ansible_shell_executable' from source: unknown 7530 1727096019.98816: variable 'ansible_connection' from source: unknown 7530 1727096019.98819: variable 'ansible_module_compression' from source: unknown 7530 1727096019.98821: variable 'ansible_shell_type' from source: unknown 7530 1727096019.98823: variable 'ansible_shell_executable' from source: unknown 7530 1727096019.98825: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096019.98826: variable 'ansible_pipelining' from source: unknown 7530 1727096019.98828: variable 'ansible_timeout' from source: unknown 7530 1727096019.98830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096019.98959: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096019.98981: variable 'omit' from source: magic vars 7530 1727096019.98992: starting attempt loop 7530 1727096019.99001: running the handler 7530 1727096019.99022: _low_level_execute_command(): starting 7530 1727096019.99040: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096019.99794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096019.99891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096019.99929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096019.99948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096019.99974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.00050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.01749: stdout chunk (state=3): >>>/root <<< 7530 1727096020.01905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.01909: stdout chunk (state=3): >>><<< 7530 1727096020.01911: stderr chunk (state=3): >>><<< 7530 1727096020.01931: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.01953: _low_level_execute_command(): starting 7530 1727096020.01965: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838 `" && echo ansible-tmp-1727096020.0193887-8015-195995802995838="` echo /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838 `" ) && sleep 0' 7530 1727096020.02802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096020.02875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096020.02893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.02952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.02973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.03000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.03123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.05174: stdout chunk (state=3): >>>ansible-tmp-1727096020.0193887-8015-195995802995838=/root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838 <<< 7530 1727096020.05255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.05575: stderr chunk (state=3): >>><<< 7530 1727096020.05579: stdout chunk (state=3): >>><<< 7530 1727096020.05582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096020.0193887-8015-195995802995838=/root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.05585: variable 'ansible_module_compression' from source: unknown 7530 1727096020.05636: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7530 1727096020.05920: variable 'ansible_facts' from source: unknown 7530 1727096020.05924: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py 7530 1727096020.06273: Sending initial data 7530 1727096020.06277: Sent initial data (151 bytes) 7530 1727096020.07507: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096020.07548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.07591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.07800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.07825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.07857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.09495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096020.09520: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096020.09573: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpqax5d4_4 /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py <<< 7530 1727096020.09582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py" <<< 7530 1727096020.09628: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpqax5d4_4" to remote "/root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py" <<< 7530 1727096020.10369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.10383: stderr chunk (state=3): >>><<< 7530 1727096020.10444: stdout chunk (state=3): >>><<< 7530 1727096020.10454: done transferring module to remote 7530 1727096020.10477: _low_level_execute_command(): starting 7530 1727096020.10488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/ /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py && sleep 0' 7530 1727096020.11174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096020.11192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096020.11217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.11276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096020.11294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.11363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.11391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.11417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.11492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.13337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.13362: stderr chunk (state=3): >>><<< 7530 1727096020.13364: stdout chunk (state=3): >>><<< 7530 1727096020.13378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.13388: _low_level_execute_command(): starting 7530 1727096020.13391: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/AnsiballZ_stat.py && sleep 0' 7530 1727096020.13840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.13844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.13847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096020.13849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.13898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 
1727096020.13901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.13908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.13949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.29936: stdout chunk (state=3): >>> <<< 7530 1727096020.29960: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25177, "dev": 23, "nlink": 1, "atime": 1727096018.5148296, "mtime": 1727096018.5148296, "ctime": 1727096018.5148296, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7530 1727096020.31340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096020.31370: stderr chunk (state=3): >>><<< 7530 1727096020.31374: stdout chunk (state=3): >>><<< 7530 1727096020.31389: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25177, "dev": 23, "nlink": 1, "atime": 1727096018.5148296, "mtime": 1727096018.5148296, "ctime": 1727096018.5148296, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096020.31428: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096020.31436: _low_level_execute_command(): starting 7530 1727096020.31441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096020.0193887-8015-195995802995838/ > /dev/null 2>&1 && sleep 0' 7530 1727096020.31906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096020.31910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096020.31917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096020.31919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096020.31921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.31973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.31976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.31978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.32014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.33875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.33905: stderr chunk (state=3): >>><<< 7530 1727096020.33908: stdout chunk (state=3): >>><<< 7530 1727096020.33924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.33932: handler run complete 7530 1727096020.33962: attempt loop complete, returning result 7530 1727096020.33964: _execute() done 7530 1727096020.33967: dumping result to json 7530 1727096020.33974: done dumping result, returning 7530 1727096020.33981: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0afff68d-5257-086b-f4f0-0000000007ee] 7530 1727096020.33985: sending task result for task 0afff68d-5257-086b-f4f0-0000000007ee 7530 1727096020.34092: done sending task result for task 0afff68d-5257-086b-f4f0-0000000007ee 7530 1727096020.34095: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096018.5148296, "block_size": 4096, "blocks": 0, "ctime": 1727096018.5148296, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25177, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727096018.5148296, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7530 1727096020.34191: no more pending results, returning what we have 7530 
1727096020.34195: results queue empty 7530 1727096020.34197: checking for any_errors_fatal 7530 1727096020.34199: done checking for any_errors_fatal 7530 1727096020.34199: checking for max_fail_percentage 7530 1727096020.34201: done checking for max_fail_percentage 7530 1727096020.34202: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.34203: done checking to see if all hosts have failed 7530 1727096020.34203: getting the remaining hosts for this loop 7530 1727096020.34205: done getting the remaining hosts for this loop 7530 1727096020.34209: getting the next task for host managed_node3 7530 1727096020.34217: done getting next task for host managed_node3 7530 1727096020.34220: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7530 1727096020.34223: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096020.34226: getting variables 7530 1727096020.34228: in VariableManager get_vars() 7530 1727096020.34277: Calling all_inventory to load vars for managed_node3 7530 1727096020.34279: Calling groups_inventory to load vars for managed_node3 7530 1727096020.34282: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.34291: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.34294: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.34296: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.34423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.34543: done with get_vars() 7530 1727096020.34552: done getting variables 7530 1727096020.34628: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 7530 1727096020.34718: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:53:40 -0400 (0:00:00.378) 0:00:11.135 ****** 7530 1727096020.34742: entering _queue_task() for managed_node3/assert 7530 1727096020.34743: Creating lock for assert 7530 1727096020.34971: worker is 1 (out of 1 available) 7530 1727096020.34985: exiting _queue_task() for managed_node3/assert 7530 1727096020.34999: done queuing things up, now waiting for results queue to drain 7530 1727096020.35001: waiting for pending results... 
7530 1727096020.35166: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7530 1727096020.35239: in run() - task 0afff68d-5257-086b-f4f0-0000000005f6 7530 1727096020.35250: variable 'ansible_search_path' from source: unknown 7530 1727096020.35254: variable 'ansible_search_path' from source: unknown 7530 1727096020.35284: calling self._execute() 7530 1727096020.35355: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.35359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.35372: variable 'omit' from source: magic vars 7530 1727096020.35938: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.35946: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.35952: variable 'omit' from source: magic vars 7530 1727096020.35978: variable 'omit' from source: magic vars 7530 1727096020.36053: variable 'interface' from source: play vars 7530 1727096020.36069: variable 'omit' from source: magic vars 7530 1727096020.36107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096020.36133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096020.36149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096020.36161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096020.36171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096020.36195: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096020.36200: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.36204: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.36274: Set connection var ansible_pipelining to False 7530 1727096020.36280: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096020.36285: Set connection var ansible_timeout to 10 7530 1727096020.36293: Set connection var ansible_shell_executable to /bin/sh 7530 1727096020.36296: Set connection var ansible_shell_type to sh 7530 1727096020.36298: Set connection var ansible_connection to ssh 7530 1727096020.36321: variable 'ansible_shell_executable' from source: unknown 7530 1727096020.36325: variable 'ansible_connection' from source: unknown 7530 1727096020.36327: variable 'ansible_module_compression' from source: unknown 7530 1727096020.36330: variable 'ansible_shell_type' from source: unknown 7530 1727096020.36332: variable 'ansible_shell_executable' from source: unknown 7530 1727096020.36334: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.36336: variable 'ansible_pipelining' from source: unknown 7530 1727096020.36338: variable 'ansible_timeout' from source: unknown 7530 1727096020.36340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.36443: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096020.36452: variable 'omit' from source: magic vars 7530 1727096020.36455: starting attempt loop 7530 1727096020.36457: running the handler 7530 1727096020.36549: variable 'interface_stat' from source: set_fact 7530 1727096020.36563: Evaluated conditional (interface_stat.stat.exists): True 7530 1727096020.36572: handler run complete 7530 1727096020.36581: attempt loop complete, returning result 7530 1727096020.36584: _execute() done 
7530 1727096020.36586: dumping result to json 7530 1727096020.36589: done dumping result, returning 7530 1727096020.36595: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0afff68d-5257-086b-f4f0-0000000005f6] 7530 1727096020.36600: sending task result for task 0afff68d-5257-086b-f4f0-0000000005f6 7530 1727096020.36682: done sending task result for task 0afff68d-5257-086b-f4f0-0000000005f6 7530 1727096020.36684: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096020.36734: no more pending results, returning what we have 7530 1727096020.36738: results queue empty 7530 1727096020.36739: checking for any_errors_fatal 7530 1727096020.36743: done checking for any_errors_fatal 7530 1727096020.36744: checking for max_fail_percentage 7530 1727096020.36746: done checking for max_fail_percentage 7530 1727096020.36746: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.36747: done checking to see if all hosts have failed 7530 1727096020.36748: getting the remaining hosts for this loop 7530 1727096020.36749: done getting the remaining hosts for this loop 7530 1727096020.36752: getting the next task for host managed_node3 7530 1727096020.36759: done getting next task for host managed_node3 7530 1727096020.36762: ^ task is: TASK: TEST: I can configure an interface with auto_gateway enabled 7530 1727096020.36764: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096020.36769: getting variables 7530 1727096020.36771: in VariableManager get_vars() 7530 1727096020.36823: Calling all_inventory to load vars for managed_node3 7530 1727096020.36826: Calling groups_inventory to load vars for managed_node3 7530 1727096020.36828: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.36837: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.36840: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.36842: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.37203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.37314: done with get_vars() 7530 1727096020.37324: done getting variables 7530 1727096020.37365: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway enabled] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:17 Monday 23 September 2024 08:53:40 -0400 (0:00:00.026) 0:00:11.162 ****** 7530 1727096020.37386: entering _queue_task() for managed_node3/debug 7530 1727096020.37588: worker is 1 (out of 1 available) 7530 1727096020.37599: exiting _queue_task() for managed_node3/debug 7530 1727096020.37610: done queuing things up, now waiting for results queue to drain 7530 1727096020.37612: waiting for pending results... 
7530 1727096020.37780: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled 7530 1727096020.37849: in run() - task 0afff68d-5257-086b-f4f0-00000000000e 7530 1727096020.37861: variable 'ansible_search_path' from source: unknown 7530 1727096020.37891: calling self._execute() 7530 1727096020.37964: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.37969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.37979: variable 'omit' from source: magic vars 7530 1727096020.38266: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.38280: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.38287: variable 'omit' from source: magic vars 7530 1727096020.38301: variable 'omit' from source: magic vars 7530 1727096020.38328: variable 'omit' from source: magic vars 7530 1727096020.38360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096020.38390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096020.38406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096020.38418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096020.38429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096020.38453: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096020.38456: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.38460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.38531: Set connection var ansible_pipelining to False 7530 
1727096020.38536: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096020.38542: Set connection var ansible_timeout to 10 7530 1727096020.38549: Set connection var ansible_shell_executable to /bin/sh 7530 1727096020.38552: Set connection var ansible_shell_type to sh 7530 1727096020.38554: Set connection var ansible_connection to ssh 7530 1727096020.38574: variable 'ansible_shell_executable' from source: unknown 7530 1727096020.38577: variable 'ansible_connection' from source: unknown 7530 1727096020.38580: variable 'ansible_module_compression' from source: unknown 7530 1727096020.38583: variable 'ansible_shell_type' from source: unknown 7530 1727096020.38585: variable 'ansible_shell_executable' from source: unknown 7530 1727096020.38587: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.38590: variable 'ansible_pipelining' from source: unknown 7530 1727096020.38592: variable 'ansible_timeout' from source: unknown 7530 1727096020.38602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.38697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096020.38708: variable 'omit' from source: magic vars 7530 1727096020.38712: starting attempt loop 7530 1727096020.38715: running the handler 7530 1727096020.38752: handler run complete 7530 1727096020.38765: attempt loop complete, returning result 7530 1727096020.38769: _execute() done 7530 1727096020.38772: dumping result to json 7530 1727096020.38775: done dumping result, returning 7530 1727096020.38780: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled [0afff68d-5257-086b-f4f0-00000000000e] 7530 
1727096020.38784: sending task result for task 0afff68d-5257-086b-f4f0-00000000000e 7530 1727096020.38877: done sending task result for task 0afff68d-5257-086b-f4f0-00000000000e 7530 1727096020.38880: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7530 1727096020.38925: no more pending results, returning what we have 7530 1727096020.38929: results queue empty 7530 1727096020.38929: checking for any_errors_fatal 7530 1727096020.38939: done checking for any_errors_fatal 7530 1727096020.38940: checking for max_fail_percentage 7530 1727096020.38941: done checking for max_fail_percentage 7530 1727096020.38942: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.38943: done checking to see if all hosts have failed 7530 1727096020.38944: getting the remaining hosts for this loop 7530 1727096020.38945: done getting the remaining hosts for this loop 7530 1727096020.38948: getting the next task for host managed_node3 7530 1727096020.38954: done getting next task for host managed_node3 7530 1727096020.38959: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7530 1727096020.38962: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096020.38978: getting variables 7530 1727096020.38980: in VariableManager get_vars() 7530 1727096020.39023: Calling all_inventory to load vars for managed_node3 7530 1727096020.39026: Calling groups_inventory to load vars for managed_node3 7530 1727096020.39028: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.39036: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.39038: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.39041: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.39165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.39300: done with get_vars() 7530 1727096020.39308: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:53:40 -0400 (0:00:00.019) 0:00:11.182 ****** 7530 1727096020.39377: entering _queue_task() for managed_node3/include_tasks 7530 1727096020.39584: worker is 1 (out of 1 available) 7530 1727096020.39597: exiting _queue_task() for managed_node3/include_tasks 7530 1727096020.39608: done queuing things up, now waiting for results queue to drain 7530 1727096020.39609: waiting for pending results... 
7530 1727096020.39771: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7530 1727096020.39870: in run() - task 0afff68d-5257-086b-f4f0-000000000016 7530 1727096020.39883: variable 'ansible_search_path' from source: unknown 7530 1727096020.39886: variable 'ansible_search_path' from source: unknown 7530 1727096020.39913: calling self._execute() 7530 1727096020.39978: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.39982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.39990: variable 'omit' from source: magic vars 7530 1727096020.40319: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.40328: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.40334: _execute() done 7530 1727096020.40337: dumping result to json 7530 1727096020.40340: done dumping result, returning 7530 1727096020.40347: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-086b-f4f0-000000000016] 7530 1727096020.40352: sending task result for task 0afff68d-5257-086b-f4f0-000000000016 7530 1727096020.40439: done sending task result for task 0afff68d-5257-086b-f4f0-000000000016 7530 1727096020.40442: WORKER PROCESS EXITING 7530 1727096020.40482: no more pending results, returning what we have 7530 1727096020.40486: in VariableManager get_vars() 7530 1727096020.40540: Calling all_inventory to load vars for managed_node3 7530 1727096020.40543: Calling groups_inventory to load vars for managed_node3 7530 1727096020.40545: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.40554: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.40556: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.40559: Calling groups_plugins_play to load vars for 
managed_node3 7530 1727096020.40737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.40857: done with get_vars() 7530 1727096020.40863: variable 'ansible_search_path' from source: unknown 7530 1727096020.40864: variable 'ansible_search_path' from source: unknown 7530 1727096020.40895: we have included files to process 7530 1727096020.40896: generating all_blocks data 7530 1727096020.40898: done generating all_blocks data 7530 1727096020.40901: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7530 1727096020.40902: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7530 1727096020.40903: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7530 1727096020.41372: done processing included file 7530 1727096020.41374: iterating over new_blocks loaded from include file 7530 1727096020.41375: in VariableManager get_vars() 7530 1727096020.41394: done with get_vars() 7530 1727096020.41396: filtering new block on tags 7530 1727096020.41407: done filtering new block on tags 7530 1727096020.41408: in VariableManager get_vars() 7530 1727096020.41426: done with get_vars() 7530 1727096020.41427: filtering new block on tags 7530 1727096020.41438: done filtering new block on tags 7530 1727096020.41440: in VariableManager get_vars() 7530 1727096020.41460: done with get_vars() 7530 1727096020.41461: filtering new block on tags 7530 1727096020.41473: done filtering new block on tags 7530 1727096020.41475: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7530 1727096020.41479: extending task lists for all hosts with included blocks 7530 1727096020.41953: done 
extending task lists 7530 1727096020.41954: done processing included files 7530 1727096020.41954: results queue empty 7530 1727096020.41955: checking for any_errors_fatal 7530 1727096020.41957: done checking for any_errors_fatal 7530 1727096020.41957: checking for max_fail_percentage 7530 1727096020.41958: done checking for max_fail_percentage 7530 1727096020.41958: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.41959: done checking to see if all hosts have failed 7530 1727096020.41959: getting the remaining hosts for this loop 7530 1727096020.41960: done getting the remaining hosts for this loop 7530 1727096020.41962: getting the next task for host managed_node3 7530 1727096020.41964: done getting next task for host managed_node3 7530 1727096020.41966: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7530 1727096020.41970: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096020.41977: getting variables 7530 1727096020.41977: in VariableManager get_vars() 7530 1727096020.41992: Calling all_inventory to load vars for managed_node3 7530 1727096020.41994: Calling groups_inventory to load vars for managed_node3 7530 1727096020.41995: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.41999: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.42001: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.42003: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.42089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.42212: done with get_vars() 7530 1727096020.42219: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:53:40 -0400 (0:00:00.028) 0:00:11.211 ****** 7530 1727096020.42271: entering _queue_task() for managed_node3/setup 7530 1727096020.42498: worker is 1 (out of 1 available) 7530 1727096020.42512: exiting _queue_task() for managed_node3/setup 7530 1727096020.42527: done queuing things up, now waiting for results queue to drain 7530 1727096020.42529: waiting for pending results... 
7530 1727096020.42696: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7530 1727096020.42799: in run() - task 0afff68d-5257-086b-f4f0-000000000809 7530 1727096020.42809: variable 'ansible_search_path' from source: unknown 7530 1727096020.42813: variable 'ansible_search_path' from source: unknown 7530 1727096020.42842: calling self._execute() 7530 1727096020.42909: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.42912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.42924: variable 'omit' from source: magic vars 7530 1727096020.43199: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.43203: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.43378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096020.44825: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096020.44878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096020.44905: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096020.44932: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096020.44955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096020.45014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096020.45035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096020.45054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096020.45084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096020.45095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096020.45136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096020.45152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096020.45177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096020.45201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096020.45211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096020.45319: variable '__network_required_facts' from source: role '' defaults 
7530 1727096020.45329: variable 'ansible_facts' from source: unknown 7530 1727096020.45393: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7530 1727096020.45398: when evaluation is False, skipping this task 7530 1727096020.45401: _execute() done 7530 1727096020.45403: dumping result to json 7530 1727096020.45405: done dumping result, returning 7530 1727096020.45412: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-086b-f4f0-000000000809] 7530 1727096020.45417: sending task result for task 0afff68d-5257-086b-f4f0-000000000809 7530 1727096020.45510: done sending task result for task 0afff68d-5257-086b-f4f0-000000000809 7530 1727096020.45513: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096020.45555: no more pending results, returning what we have 7530 1727096020.45559: results queue empty 7530 1727096020.45559: checking for any_errors_fatal 7530 1727096020.45561: done checking for any_errors_fatal 7530 1727096020.45561: checking for max_fail_percentage 7530 1727096020.45563: done checking for max_fail_percentage 7530 1727096020.45564: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.45565: done checking to see if all hosts have failed 7530 1727096020.45565: getting the remaining hosts for this loop 7530 1727096020.45567: done getting the remaining hosts for this loop 7530 1727096020.45572: getting the next task for host managed_node3 7530 1727096020.45580: done getting next task for host managed_node3 7530 1727096020.45583: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7530 1727096020.45587: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
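The skip above comes from the guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False. A minimal Python sketch of the same set-difference check (the variable names are taken from the log, but the actual contents of `__network_required_facts` and the facts dict are assumptions for illustration):

```python
# Names from the log; the concrete values below are assumed for illustration.
required_facts = ["distribution", "distribution_major_version"]

ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
}

# Jinja's `difference` filter behaves as a set difference over the two sequences.
missing = set(required_facts) - set(ansible_facts.keys())

# A non-empty difference would trigger the setup task; here it is empty,
# so the task is skipped, matching "Evaluated conditional (...): False" in the log.
run_setup = len(missing) > 0
```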
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096020.45599: getting variables 7530 1727096020.45601: in VariableManager get_vars() 7530 1727096020.45650: Calling all_inventory to load vars for managed_node3 7530 1727096020.45653: Calling groups_inventory to load vars for managed_node3 7530 1727096020.45655: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.45664: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.45666: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.45677: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.45843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.45982: done with get_vars() 7530 1727096020.45989: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:53:40 -0400 (0:00:00.037) 0:00:11.248 ****** 7530 1727096020.46067: entering _queue_task() for managed_node3/stat 7530 1727096020.46277: worker is 1 (out of 1 available) 7530 1727096020.46289: exiting _queue_task() 
for managed_node3/stat 7530 1727096020.46300: done queuing things up, now waiting for results queue to drain 7530 1727096020.46302: waiting for pending results... 7530 1727096020.46471: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7530 1727096020.46572: in run() - task 0afff68d-5257-086b-f4f0-00000000080b 7530 1727096020.46583: variable 'ansible_search_path' from source: unknown 7530 1727096020.46588: variable 'ansible_search_path' from source: unknown 7530 1727096020.46617: calling self._execute() 7530 1727096020.46682: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.46687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.46695: variable 'omit' from source: magic vars 7530 1727096020.46972: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.46983: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.47099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096020.47295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096020.47327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096020.47353: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096020.47380: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096020.47468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096020.47487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096020.47508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096020.47574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096020.47593: variable '__network_is_ostree' from source: set_fact 7530 1727096020.47600: Evaluated conditional (not __network_is_ostree is defined): False 7530 1727096020.47603: when evaluation is False, skipping this task 7530 1727096020.47605: _execute() done 7530 1727096020.47608: dumping result to json 7530 1727096020.47612: done dumping result, returning 7530 1727096020.47672: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-086b-f4f0-00000000080b] 7530 1727096020.47675: sending task result for task 0afff68d-5257-086b-f4f0-00000000080b 7530 1727096020.47737: done sending task result for task 0afff68d-5257-086b-f4f0-00000000080b 7530 1727096020.47739: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7530 1727096020.47795: no more pending results, returning what we have 7530 1727096020.47799: results queue empty 7530 1727096020.47800: checking for any_errors_fatal 7530 1727096020.47807: done checking for any_errors_fatal 7530 1727096020.47807: checking for max_fail_percentage 7530 1727096020.47809: done checking for max_fail_percentage 7530 1727096020.47810: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.47811: done checking to see if all hosts have failed 7530 
1727096020.47811: getting the remaining hosts for this loop 7530 1727096020.47813: done getting the remaining hosts for this loop 7530 1727096020.47816: getting the next task for host managed_node3 7530 1727096020.47821: done getting next task for host managed_node3 7530 1727096020.47825: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7530 1727096020.47828: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096020.47841: getting variables 7530 1727096020.47842: in VariableManager get_vars() 7530 1727096020.47886: Calling all_inventory to load vars for managed_node3 7530 1727096020.47889: Calling groups_inventory to load vars for managed_node3 7530 1727096020.47891: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.47900: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.47902: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.47909: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.48024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.48150: done with get_vars() 7530 1727096020.48158: done getting variables 7530 1727096020.48199: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:53:40 -0400 (0:00:00.021) 0:00:11.270 ****** 7530 1727096020.48225: entering _queue_task() for managed_node3/set_fact 7530 1727096020.48424: worker is 1 (out of 1 available) 7530 1727096020.48434: exiting _queue_task() for managed_node3/set_fact 7530 1727096020.48446: done queuing things up, now waiting for results queue to drain 7530 1727096020.48448: waiting for pending results... 
7530 1727096020.48615: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7530 1727096020.48718: in run() - task 0afff68d-5257-086b-f4f0-00000000080c 7530 1727096020.48733: variable 'ansible_search_path' from source: unknown 7530 1727096020.48736: variable 'ansible_search_path' from source: unknown 7530 1727096020.48763: calling self._execute() 7530 1727096020.48830: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.48833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.48842: variable 'omit' from source: magic vars 7530 1727096020.49170: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.49180: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.49295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096020.49491: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096020.49522: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096020.49548: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096020.49580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096020.49642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096020.49664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096020.49685: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096020.49704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096020.49766: variable '__network_is_ostree' from source: set_fact 7530 1727096020.49780: Evaluated conditional (not __network_is_ostree is defined): False 7530 1727096020.49783: when evaluation is False, skipping this task 7530 1727096020.49786: _execute() done 7530 1727096020.49788: dumping result to json 7530 1727096020.49791: done dumping result, returning 7530 1727096020.49799: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-086b-f4f0-00000000080c] 7530 1727096020.49804: sending task result for task 0afff68d-5257-086b-f4f0-00000000080c 7530 1727096020.49887: done sending task result for task 0afff68d-5257-086b-f4f0-00000000080c 7530 1727096020.49890: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7530 1727096020.49933: no more pending results, returning what we have 7530 1727096020.49937: results queue empty 7530 1727096020.49938: checking for any_errors_fatal 7530 1727096020.49943: done checking for any_errors_fatal 7530 1727096020.49944: checking for max_fail_percentage 7530 1727096020.49946: done checking for max_fail_percentage 7530 1727096020.49946: checking to see if all hosts have failed and the running result is not ok 7530 1727096020.49948: done checking to see if all hosts have failed 7530 1727096020.49948: getting the remaining hosts for this loop 7530 1727096020.49950: done getting the remaining hosts for this loop 7530 1727096020.49953: 
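Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") share the guard `not __network_is_ostree is defined`. Once an earlier `set_fact` has stored the flag, the guard is False and both tasks skip. A sketch of that logic (the stored flag value is an assumption; only its presence matters):

```python
# The flag was set by a previous set_fact; its value here is assumed.
host_facts = {"__network_is_ostree": False}

def ostree_check_needed(facts):
    # Jinja: when: not __network_is_ostree is defined
    return "__network_is_ostree" not in facts

# Guard is False -> task skipped, matching "skip_reason": "Conditional result was False"
skipped = not ostree_check_needed(host_facts)
```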
getting the next task for host managed_node3 7530 1727096020.49962: done getting next task for host managed_node3 7530 1727096020.49966: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7530 1727096020.49972: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096020.49988: getting variables 7530 1727096020.49989: in VariableManager get_vars() 7530 1727096020.50032: Calling all_inventory to load vars for managed_node3 7530 1727096020.50035: Calling groups_inventory to load vars for managed_node3 7530 1727096020.50037: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096020.50045: Calling all_plugins_play to load vars for managed_node3 7530 1727096020.50047: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096020.50049: Calling groups_plugins_play to load vars for managed_node3 7530 1727096020.50221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096020.50346: done with get_vars() 7530 1727096020.50354: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:53:40 -0400 (0:00:00.021) 0:00:11.292 ****** 7530 1727096020.50423: entering _queue_task() for managed_node3/service_facts 7530 1727096020.50424: Creating lock for service_facts 7530 1727096020.50630: worker is 1 (out of 1 available) 7530 1727096020.50644: exiting _queue_task() for managed_node3/service_facts 7530 1727096020.50656: done queuing things up, now waiting for results queue to drain 7530 1727096020.50657: waiting for pending results... 
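The task queued here ("Check which services are running", at set_facts.yml:21) invokes the `service_facts` module, as the `entering _queue_task() for managed_node3/service_facts` line shows. A plausible sketch of the task YAML; the real role file may carry conditions or options not visible in this log:

```yaml
- name: Check which services are running
  service_facts:
```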
7530 1727096020.50826: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7530 1727096020.50932: in run() - task 0afff68d-5257-086b-f4f0-00000000080e 7530 1727096020.50943: variable 'ansible_search_path' from source: unknown 7530 1727096020.50947: variable 'ansible_search_path' from source: unknown 7530 1727096020.50977: calling self._execute() 7530 1727096020.51042: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.51046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.51055: variable 'omit' from source: magic vars 7530 1727096020.51327: variable 'ansible_distribution_major_version' from source: facts 7530 1727096020.51338: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096020.51344: variable 'omit' from source: magic vars 7530 1727096020.51391: variable 'omit' from source: magic vars 7530 1727096020.51415: variable 'omit' from source: magic vars 7530 1727096020.51451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096020.51479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096020.51494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096020.51507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096020.51516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096020.51545: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096020.51548: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.51551: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096020.51618: Set connection var ansible_pipelining to False 7530 1727096020.51626: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096020.51631: Set connection var ansible_timeout to 10 7530 1727096020.51639: Set connection var ansible_shell_executable to /bin/sh 7530 1727096020.51644: Set connection var ansible_shell_type to sh 7530 1727096020.51646: Set connection var ansible_connection to ssh 7530 1727096020.51666: variable 'ansible_shell_executable' from source: unknown 7530 1727096020.51670: variable 'ansible_connection' from source: unknown 7530 1727096020.51673: variable 'ansible_module_compression' from source: unknown 7530 1727096020.51675: variable 'ansible_shell_type' from source: unknown 7530 1727096020.51678: variable 'ansible_shell_executable' from source: unknown 7530 1727096020.51680: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096020.51682: variable 'ansible_pipelining' from source: unknown 7530 1727096020.51685: variable 'ansible_timeout' from source: unknown 7530 1727096020.51689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096020.51837: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096020.51846: variable 'omit' from source: magic vars 7530 1727096020.51851: starting attempt loop 7530 1727096020.51854: running the handler 7530 1727096020.51868: _low_level_execute_command(): starting 7530 1727096020.51879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096020.52391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.52395: stderr 
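The `auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'` lines in the SSH debug output show OpenSSH connection multiplexing: Ansible reuses one master connection per host instead of re-authenticating for each command. Ansible passes these as `-o` options on the ssh command line; an approximately equivalent client config (paths are Ansible's defaults, sketched from the log):

```
# ~/.ssh/config equivalent of the multiplexing options seen in the log
Host *
    ControlMaster auto
    ControlPath ~/.ansible/cp/%C
    ControlPersist 60s
```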
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096020.52398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096020.52401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.52455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.52458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.52460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.52508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.54191: stdout chunk (state=3): >>>/root <<< 7530 1727096020.54284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.54315: stderr chunk (state=3): >>><<< 7530 1727096020.54319: stdout chunk (state=3): >>><<< 7530 1727096020.54339: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.54352: _low_level_execute_command(): starting 7530 1727096020.54364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640 `" && echo ansible-tmp-1727096020.5433948-8044-51588700730640="` echo /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640 `" ) && sleep 0' 7530 1727096020.54817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.54821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.54842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.54893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.54899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.54903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.54941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.56916: stdout chunk (state=3): >>>ansible-tmp-1727096020.5433948-8044-51588700730640=/root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640 <<< 7530 1727096020.57014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.57044: stderr chunk (state=3): >>><<< 7530 1727096020.57049: stdout chunk (state=3): >>><<< 7530 1727096020.57070: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096020.5433948-8044-51588700730640=/root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
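The `_low_level_execute_command()` above creates the remote temp directory with a single compound shell command. The pattern is sketched below with an assumed suffix standing in for the timestamped `ansible-tmp-<epoch>-<pid>-<random>` name from the log:

```shell
# Restrict permissions on everything created below (dirs end up 0700)
umask 77
# Stand-in for the timestamped name in the log (assumed for illustration)
TMPNAME="ansible-tmp-demo-$$"
# Parent may already exist (-p); the leaf must not (plain mkdir fails if it does,
# which guards against reusing another run's directory)
mkdir -p "$HOME/.ansible/tmp"
mkdir "$HOME/.ansible/tmp/$TMPNAME"
# Ansible echoes the path back so the controller learns where to upload the module
echo "$HOME/.ansible/tmp/$TMPNAME"
```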
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.57111: variable 'ansible_module_compression' from source: unknown 7530 1727096020.57148: ANSIBALLZ: Using lock for service_facts 7530 1727096020.57152: ANSIBALLZ: Acquiring lock 7530 1727096020.57154: ANSIBALLZ: Lock acquired: 139837168242896 7530 1727096020.57156: ANSIBALLZ: Creating module 7530 1727096020.65578: ANSIBALLZ: Writing module into payload 7530 1727096020.65642: ANSIBALLZ: Writing module 7530 1727096020.65664: ANSIBALLZ: Renaming module 7530 1727096020.65670: ANSIBALLZ: Done creating module 7530 1727096020.65687: variable 'ansible_facts' from source: unknown 7530 1727096020.65736: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py 7530 1727096020.65841: Sending initial data 7530 1727096020.65845: Sent initial data (159 bytes) 7530 1727096020.66316: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.66320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.66322: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096020.66326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096020.66329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.66374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.66386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.66434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.68107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096020.68136: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096020.68178: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp8fe4lifp /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py <<< 7530 1727096020.68181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py" <<< 7530 1727096020.68199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp8fe4lifp" to remote "/root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py" <<< 7530 1727096020.68204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py" <<< 7530 1727096020.68717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.68766: stderr chunk (state=3): >>><<< 7530 1727096020.68771: stdout chunk (state=3): >>><<< 7530 1727096020.68839: done transferring module to remote 7530 1727096020.68845: _low_level_execute_command(): starting 7530 1727096020.68850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/ /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py && sleep 0' 7530 1727096020.69309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.69313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096020.69315: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.69324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.69326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.69375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.69379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.69386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.69419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096020.71286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096020.71310: stderr chunk (state=3): >>><<< 7530 1727096020.71315: stdout chunk (state=3): >>><<< 7530 1727096020.71332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096020.71335: _low_level_execute_command(): starting 7530 1727096020.71340: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/AnsiballZ_service_facts.py && sleep 0' 7530 1727096020.71794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.71799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.71801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096020.71803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096020.71856: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096020.71859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096020.71872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096020.71905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096022.49275: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7530 1727096022.49294: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 7530 1727096022.49333: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service"<<< 7530 1727096022.49340: stdout chunk (state=3): >>>, "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@<<< 7530 1727096022.49345: stdout chunk (state=3): >>>.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7530 1727096022.51050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096022.51082: stderr chunk (state=3): >>><<< 7530 1727096022.51089: stdout chunk (state=3): >>><<< 7530 1727096022.51108: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": 
{"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": 
{"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096022.51476: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096022.51485: _low_level_execute_command(): starting 7530 1727096022.51488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096020.5433948-8044-51588700730640/ > /dev/null 2>&1 && sleep 0' 7530 1727096022.51958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096022.51961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096022.51963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096022.51966: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096022.51971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.52017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096022.52020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096022.52022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096022.52067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096022.53957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096022.53985: stderr chunk (state=3): >>><<< 7530 1727096022.53988: stdout chunk (state=3): >>><<< 7530 1727096022.54006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096022.54009: handler run complete 7530 1727096022.54121: variable 'ansible_facts' from source: unknown 7530 1727096022.54222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096022.54476: variable 'ansible_facts' from source: unknown 7530 1727096022.55320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096022.55437: attempt loop complete, returning result 7530 1727096022.55441: _execute() done 7530 1727096022.55443: dumping result to json 7530 1727096022.55479: done dumping result, returning 7530 1727096022.55487: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-086b-f4f0-00000000080e] 7530 1727096022.55491: sending task result for task 0afff68d-5257-086b-f4f0-00000000080e ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096022.56054: no more pending results, returning what we have 7530 1727096022.56056: results queue empty 7530 1727096022.56057: checking for any_errors_fatal 7530 1727096022.56061: done checking for any_errors_fatal 7530 1727096022.56062: checking for max_fail_percentage 7530 1727096022.56063: done checking for max_fail_percentage 7530 1727096022.56064: checking to see if all hosts have failed and the running result is not ok 7530 1727096022.56065: done checking to see if all hosts have failed 7530 1727096022.56065: getting the remaining hosts for this loop 7530 1727096022.56067: done getting the remaining hosts for this loop 7530 1727096022.56072: getting the next task for host managed_node3 7530 1727096022.56084: done getting next task for host managed_node3 7530 1727096022.56088: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 7530 1727096022.56091: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096022.56099: done sending task result for task 0afff68d-5257-086b-f4f0-00000000080e 7530 1727096022.56102: WORKER PROCESS EXITING 7530 1727096022.56108: getting variables 7530 1727096022.56109: in VariableManager get_vars() 7530 1727096022.56142: Calling all_inventory to load vars for managed_node3 7530 1727096022.56144: Calling groups_inventory to load vars for managed_node3 7530 1727096022.56145: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096022.56152: Calling all_plugins_play to load vars for managed_node3 7530 1727096022.56153: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096022.56155: Calling groups_plugins_play to load vars for managed_node3 7530 1727096022.56390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096022.56673: done with get_vars() 7530 1727096022.56684: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:53:42 -0400 (0:00:02.063) 0:00:13.355 ****** 7530 1727096022.56758: entering _queue_task() for managed_node3/package_facts 7530 1727096022.56760: Creating lock for package_facts 7530 1727096022.56984: worker is 1 (out of 1 available) 7530 1727096022.56996: exiting _queue_task() for managed_node3/package_facts 7530 1727096022.57008: done queuing things up, now waiting for results queue to drain 7530 1727096022.57010: waiting for pending results... 7530 1727096022.57184: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7530 1727096022.57295: in run() - task 0afff68d-5257-086b-f4f0-00000000080f 7530 1727096022.57307: variable 'ansible_search_path' from source: unknown 7530 1727096022.57310: variable 'ansible_search_path' from source: unknown 7530 1727096022.57341: calling self._execute() 7530 1727096022.57404: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096022.57409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096022.57418: variable 'omit' from source: magic vars 7530 1727096022.57691: variable 'ansible_distribution_major_version' from source: facts 7530 1727096022.57702: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096022.57707: variable 'omit' from source: magic vars 7530 1727096022.57754: variable 'omit' from source: magic vars 7530 1727096022.57780: variable 'omit' from source: magic vars 7530 1727096022.57815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096022.57843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096022.57859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 
1727096022.57874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096022.57883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096022.57910: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096022.57913: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096022.57915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096022.57985: Set connection var ansible_pipelining to False 7530 1727096022.57990: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096022.57996: Set connection var ansible_timeout to 10 7530 1727096022.58007: Set connection var ansible_shell_executable to /bin/sh 7530 1727096022.58011: Set connection var ansible_shell_type to sh 7530 1727096022.58013: Set connection var ansible_connection to ssh 7530 1727096022.58036: variable 'ansible_shell_executable' from source: unknown 7530 1727096022.58039: variable 'ansible_connection' from source: unknown 7530 1727096022.58042: variable 'ansible_module_compression' from source: unknown 7530 1727096022.58044: variable 'ansible_shell_type' from source: unknown 7530 1727096022.58047: variable 'ansible_shell_executable' from source: unknown 7530 1727096022.58049: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096022.58051: variable 'ansible_pipelining' from source: unknown 7530 1727096022.58053: variable 'ansible_timeout' from source: unknown 7530 1727096022.58058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096022.58205: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096022.58213: variable 'omit' from source: magic vars 7530 1727096022.58225: starting attempt loop 7530 1727096022.58229: running the handler 7530 1727096022.58238: _low_level_execute_command(): starting 7530 1727096022.58245: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096022.58771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096022.58777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096022.58780: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096022.58783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.58833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096022.58836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096022.58883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096022.60573: stdout chunk (state=3): 
>>>/root <<< 7530 1727096022.60662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096022.60696: stderr chunk (state=3): >>><<< 7530 1727096022.60699: stdout chunk (state=3): >>><<< 7530 1727096022.60719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096022.60773: _low_level_execute_command(): starting 7530 1727096022.60776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453 `" && echo ansible-tmp-1727096022.6071973-8084-255757670390453="` echo /root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453 `" ) && sleep 0' 7530 1727096022.61204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096022.61208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.61210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096022.61220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.61273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096022.61283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096022.61289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096022.61318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096022.63303: stdout chunk (state=3): >>>ansible-tmp-1727096022.6071973-8084-255757670390453=/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453 <<< 7530 1727096022.63403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096022.63436: stderr chunk (state=3): >>><<< 7530 1727096022.63440: stdout chunk (state=3): >>><<< 7530 1727096022.63456: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096022.6071973-8084-255757670390453=/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096022.63498: variable 'ansible_module_compression' from source: unknown 7530 1727096022.63544: ANSIBALLZ: Using lock for package_facts 7530 1727096022.63547: ANSIBALLZ: Acquiring lock 7530 1727096022.63550: ANSIBALLZ: Lock acquired: 139837163172400 7530 1727096022.63552: ANSIBALLZ: Creating module 7530 1727096022.82407: ANSIBALLZ: Writing module into payload 7530 1727096022.82496: ANSIBALLZ: Writing module 7530 1727096022.82520: ANSIBALLZ: Renaming module 7530 1727096022.82526: ANSIBALLZ: Done creating module 7530 1727096022.82545: variable 'ansible_facts' from source: unknown 7530 1727096022.82665: transferring module to remote 
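The `mkdir` command above creates the remote work directory under a 077 umask, with a name of the form `ansible-tmp-<epoch>-<pid>-<random>` (here `ansible-tmp-1727096022.6071973-8084-255757670390453`). A hedged reconstruction of that naming scheme, based only on the pattern visible in the log rather than on the exact ansible-core implementation:

```python
# Sketch: reproducing the remote tmp-dir name pattern seen in the log.
# The three components match the logged name ansible-tmp-<time>-<pid>-<rand>;
# the randint bound is an assumption, not taken from ansible-core.
import os
import random
import time

def tmp_dir_name() -> str:
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2**48))

name = tmp_dir_name()
print(name)  # e.g. ansible-tmp-1727096022.6071973-8084-255757670390453
```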
/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py 7530 1727096022.82774: Sending initial data 7530 1727096022.82777: Sent initial data (160 bytes) 7530 1727096022.83221: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096022.83224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.83244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096022.83247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.83303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096022.83306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096022.83309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096022.83357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096022.85012: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096022.85032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096022.85064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp68in5ibs /root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py <<< 7530 1727096022.85084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py" <<< 7530 1727096022.85089: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp68in5ibs" to remote "/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py" <<< 7530 1727096022.85096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py" <<< 7530 1727096022.86125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096022.86246: stderr chunk (state=3): >>><<< 7530 1727096022.86249: stdout chunk (state=3): >>><<< 7530 1727096022.86251: done transferring module to remote 7530 1727096022.86254: _low_level_execute_command(): starting 7530 1727096022.86256: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/ /root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py && sleep 0' 7530 1727096022.86675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096022.86680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096022.86682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.86684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096022.86692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096022.86694: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.86759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096022.86778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096022.86815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096022.86891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096022.88748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096022.88762: stderr chunk (state=3): 
>>><<< 7530 1727096022.88774: stdout chunk (state=3): >>><<< 7530 1727096022.88798: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096022.88864: _low_level_execute_command(): starting 7530 1727096022.88870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/AnsiballZ_package_facts.py && sleep 0' 7530 1727096022.89298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096022.89302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
not found <<< 7530 1727096022.89304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096022.89306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096022.89308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096022.89352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096022.89357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096022.89401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096023.34583: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": 
"20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", 
"version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": 
[{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": 
"0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": 
"audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": 
"librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", 
"version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 7530 1727096023.34648: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", 
"release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": 
"3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 7530 1727096023.34723: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": 
[{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7530 1727096023.36510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
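The JSON blob that ends above is the result of `ansible.builtin.package_facts` (note the `invocation` block with `"manager": ["auto"], "strategy": "first"`), shaped as `{"ansible_facts": {"packages": {name: [instance, ...]}}}`. As a minimal sketch for working with a captured copy of such output, the hypothetical helper below flattens it into `name -> "version-release"` strings; the function name and the instance-selection policy (keep the first entry) are illustrative choices, not part of the module's API:

```python
import json

def flatten_packages(result):
    """Flatten a package_facts-style result into {name: "version-release"}."""
    packages = result["ansible_facts"]["packages"]
    flat = {}
    for name, instances in packages.items():
        # A name can map to several installed instances (e.g. multilib);
        # this sketch simply keeps the first one listed.
        info = instances[0]
        flat[name] = f'{info["version"]}-{info["release"]}'
    return flat

# Tiny sample in the same shape as the log output above.
sample = {
    "ansible_facts": {
        "packages": {
            "rng-tools": [{"name": "rng-tools", "version": "6.17",
                           "release": "3.el10", "epoch": None,
                           "arch": "x86_64", "source": "rpm"}]
        }
    }
}
print(flatten_packages(sample))  # {'rng-tools': '6.17-3.el10'}
```

In a playbook you would normally reach the same data via the `ansible_facts.packages` variable after a `package_facts` task, rather than parsing raw debug output.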
<<< 7530 1727096023.36517: stdout chunk (state=3): >>><<< 7530 1727096023.36519: stderr chunk (state=3): >>><<< 7530 1727096023.36628: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096023.39163: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096023.39169: _low_level_execute_command(): starting 7530 1727096023.39172: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096022.6071973-8084-255757670390453/ > /dev/null 2>&1 && sleep 0' 7530 1727096023.39681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096023.39696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096023.39712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096023.39731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096023.39788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096023.39844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096023.39860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096023.39885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096023.39961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096023.41889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096023.41894: stdout chunk (state=3): >>><<< 7530 1727096023.41896: stderr chunk (state=3): >>><<< 7530 1727096023.41920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096023.41932: handler run complete 7530 1727096023.48185: variable 'ansible_facts' from source: unknown 7530 1727096023.48875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.50494: variable 'ansible_facts' from source: unknown 7530 1727096023.50911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.51602: attempt loop complete, returning result 7530 1727096023.51628: _execute() done 7530 1727096023.51637: dumping result to json 7530 1727096023.51851: done dumping result, returning 7530 1727096023.51869: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-086b-f4f0-00000000080f] 7530 1727096023.51879: sending task result for task 0afff68d-5257-086b-f4f0-00000000080f 7530 1727096023.54408: done sending task result for task 0afff68d-5257-086b-f4f0-00000000080f 7530 1727096023.54411: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096023.54504: no more pending results, returning what we have 7530 1727096023.54507: results queue empty 7530 1727096023.54508: checking for any_errors_fatal 7530 1727096023.54512: done checking for any_errors_fatal 7530 1727096023.54512: checking for max_fail_percentage 7530 1727096023.54514: done checking for max_fail_percentage 7530 1727096023.54515: checking to see if all hosts have failed and the running result is not ok 7530 1727096023.54516: done checking to see if all hosts have failed 7530 1727096023.54516: getting the remaining hosts for this loop 7530 1727096023.54518: done getting the remaining hosts for this loop 7530 1727096023.54521: getting the next task for host 
managed_node3 7530 1727096023.54527: done getting next task for host managed_node3 7530 1727096023.54530: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096023.54533: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096023.54542: getting variables 7530 1727096023.54544: in VariableManager get_vars() 7530 1727096023.54585: Calling all_inventory to load vars for managed_node3 7530 1727096023.54587: Calling groups_inventory to load vars for managed_node3 7530 1727096023.54590: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096023.54599: Calling all_plugins_play to load vars for managed_node3 7530 1727096023.54602: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096023.54605: Calling groups_plugins_play to load vars for managed_node3 7530 1727096023.55998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.58734: done with get_vars() 7530 1727096023.58766: done getting variables 7530 1727096023.58833: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:53:43 -0400 (0:00:01.021) 0:00:14.379 ****** 7530 1727096023.59077: entering _queue_task() for managed_node3/debug 7530 1727096023.59517: worker is 1 (out of 1 available) 7530 1727096023.59533: exiting _queue_task() for managed_node3/debug 7530 1727096023.59546: done queuing things up, now waiting for results queue to drain 7530 1727096023.59548: waiting for pending results... 7530 1727096023.59887: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096023.59946: in run() - task 0afff68d-5257-086b-f4f0-000000000017 7530 1727096023.59961: variable 'ansible_search_path' from source: unknown 7530 1727096023.59965: variable 'ansible_search_path' from source: unknown 7530 1727096023.60005: calling self._execute() 7530 1727096023.60108: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.60123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.60138: variable 'omit' from source: magic vars 7530 1727096023.60542: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.60629: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096023.60632: variable 'omit' from source: magic vars 7530 1727096023.60635: variable 'omit' from source: magic vars 7530 1727096023.60727: variable 'network_provider' from source: set_fact 7530 1727096023.60754: variable 'omit' from source: magic vars 7530 1727096023.60803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096023.60855: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096023.60886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096023.60910: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096023.60932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096023.60975: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096023.61063: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.61069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.61114: Set connection var ansible_pipelining to False 7530 1727096023.61130: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096023.61142: Set connection var ansible_timeout to 10 7530 1727096023.61155: Set connection var ansible_shell_executable to /bin/sh 7530 1727096023.61162: Set connection var ansible_shell_type to sh 7530 1727096023.61175: Set connection var ansible_connection to ssh 7530 1727096023.61209: variable 'ansible_shell_executable' from source: unknown 7530 1727096023.61217: variable 'ansible_connection' from source: unknown 7530 1727096023.61227: variable 'ansible_module_compression' from source: unknown 7530 1727096023.61234: variable 'ansible_shell_type' from source: unknown 7530 1727096023.61240: variable 'ansible_shell_executable' from source: unknown 7530 1727096023.61247: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.61254: variable 'ansible_pipelining' from source: unknown 7530 1727096023.61260: variable 'ansible_timeout' from source: unknown 7530 1727096023.61269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.61660: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096023.61735: variable 'omit' from source: magic vars 7530 1727096023.61738: starting attempt loop 7530 1727096023.61741: running the handler 7530 1727096023.61830: handler run complete 7530 1727096023.61833: attempt loop complete, returning result 7530 1727096023.61835: _execute() done 7530 1727096023.61837: dumping result to json 7530 1727096023.61839: done dumping result, returning 7530 1727096023.61860: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-086b-f4f0-000000000017] 7530 1727096023.61916: sending task result for task 0afff68d-5257-086b-f4f0-000000000017 7530 1727096023.62218: done sending task result for task 0afff68d-5257-086b-f4f0-000000000017 7530 1727096023.62224: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7530 1727096023.62290: no more pending results, returning what we have 7530 1727096023.62294: results queue empty 7530 1727096023.62295: checking for any_errors_fatal 7530 1727096023.62305: done checking for any_errors_fatal 7530 1727096023.62306: checking for max_fail_percentage 7530 1727096023.62308: done checking for max_fail_percentage 7530 1727096023.62309: checking to see if all hosts have failed and the running result is not ok 7530 1727096023.62310: done checking to see if all hosts have failed 7530 1727096023.62311: getting the remaining hosts for this loop 7530 1727096023.62312: done getting the remaining hosts for this loop 7530 1727096023.62316: getting the next task for host managed_node3 7530 1727096023.62326: done getting next task for host managed_node3 7530 1727096023.62330: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7530 1727096023.62334: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096023.62344: getting variables 7530 1727096023.62346: in VariableManager get_vars() 7530 1727096023.62404: Calling all_inventory to load vars for managed_node3 7530 1727096023.62407: Calling groups_inventory to load vars for managed_node3 7530 1727096023.62410: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096023.62424: Calling all_plugins_play to load vars for managed_node3 7530 1727096023.62427: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096023.62430: Calling groups_plugins_play to load vars for managed_node3 7530 1727096023.64283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.66349: done with get_vars() 7530 1727096023.66388: done getting variables 7530 1727096023.66453: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Monday 23 September 2024 08:53:43 -0400 (0:00:00.075) 0:00:14.454 ******
7530 1727096023.66607: entering _queue_task() for managed_node3/fail
7530 1727096023.67229: worker is 1 (out of 1 available)
7530 1727096023.67355: exiting _queue_task() for managed_node3/fail
7530 1727096023.67371: done queuing things up, now waiting for results queue to drain
7530 1727096023.67373: waiting for pending results...
7530 1727096023.67894: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
7530 1727096023.68000: in run() - task 0afff68d-5257-086b-f4f0-000000000018
7530 1727096023.68031: variable 'ansible_search_path' from source: unknown
7530 1727096023.68043: variable 'ansible_search_path' from source: unknown
7530 1727096023.68087: calling self._execute()
7530 1727096023.68201: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096023.68229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096023.68235: variable 'omit' from source: magic vars
7530 1727096023.68733: variable 'ansible_distribution_major_version' from source: facts
7530 1727096023.68737: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096023.68886: variable 'network_state' from source: role '' defaults
7530 1727096023.68903: Evaluated conditional (network_state != {}): False
7530 1727096023.68917: when evaluation is False, skipping this task
7530 1727096023.68925: _execute() done
7530 1727096023.68991: dumping result to json
7530 1727096023.68994: done dumping result, returning
7530 1727096023.68998: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-086b-f4f0-000000000018]
7530 1727096023.69001: sending task result for task 0afff68d-5257-086b-f4f0-000000000018
7530 1727096023.69081: done sending task result for task 0afff68d-5257-086b-f4f0-000000000018
7530 1727096023.69084: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7530 1727096023.69139: no more pending results, returning what we have
7530 1727096023.69143: results queue empty
7530 1727096023.69144: checking for any_errors_fatal
7530 1727096023.69151: done checking for any_errors_fatal
7530 1727096023.69152: checking for max_fail_percentage
7530 1727096023.69154: done checking for max_fail_percentage
7530 1727096023.69154: checking to see if all hosts have failed and the running result is not ok
7530 1727096023.69155: done checking to see if all hosts have failed
7530 1727096023.69156: getting the remaining hosts for this loop
7530 1727096023.69158: done getting the remaining hosts for this loop
7530 1727096023.69162: getting the next task for host managed_node3
7530 1727096023.69171: done getting next task for host managed_node3
7530 1727096023.69176: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
7530 1727096023.69179: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7530 1727096023.69195: getting variables 7530 1727096023.69196: in VariableManager get_vars() 7530 1727096023.69440: Calling all_inventory to load vars for managed_node3 7530 1727096023.69443: Calling groups_inventory to load vars for managed_node3 7530 1727096023.69446: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096023.69581: Calling all_plugins_play to load vars for managed_node3 7530 1727096023.69586: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096023.69590: Calling groups_plugins_play to load vars for managed_node3 7530 1727096023.72245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.74188: done with get_vars() 7530 1727096023.74220: done getting variables 7530 1727096023.74289: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:53:43 -0400 (0:00:00.077) 0:00:14.531 ****** 7530 1727096023.74329: entering _queue_task() for managed_node3/fail 7530 1727096023.74708: worker is 1 (out of 1 available) 7530 1727096023.74839: exiting _queue_task() for managed_node3/fail 7530 1727096023.74852: done queuing things up, now waiting for results queue to drain 7530 1727096023.74854: waiting for pending results... 
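Both "Abort applying the network state configuration ..." fail tasks above are skipped for the same reason: the logged false_condition is `network_state != {}`, and `network_state` still carries the role default of `{}`. In role terms this is a guarded `fail` task; a minimal sketch, where only the task name and condition come from the log and the `msg` text is an illustrative assumption, not the role's actual source (which lives at roles/network/tasks/main.yml:11 and :18):

```yaml
# Sketch of a guarded abort task matching the logged false_condition.
# The msg wording is illustrative; the when: clause is quoted from the log.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider
  when: network_state != {}  # evaluated False here, so the task is skipped
```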
7530 1727096023.75281: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096023.75286: in run() - task 0afff68d-5257-086b-f4f0-000000000019 7530 1727096023.75305: variable 'ansible_search_path' from source: unknown 7530 1727096023.75314: variable 'ansible_search_path' from source: unknown 7530 1727096023.75365: calling self._execute() 7530 1727096023.75475: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.75497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.75511: variable 'omit' from source: magic vars 7530 1727096023.75974: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.75978: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096023.76099: variable 'network_state' from source: role '' defaults 7530 1727096023.76114: Evaluated conditional (network_state != {}): False 7530 1727096023.76124: when evaluation is False, skipping this task 7530 1727096023.76140: _execute() done 7530 1727096023.76144: dumping result to json 7530 1727096023.76173: done dumping result, returning 7530 1727096023.76176: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-086b-f4f0-000000000019] 7530 1727096023.76179: sending task result for task 0afff68d-5257-086b-f4f0-000000000019 7530 1727096023.76414: done sending task result for task 0afff68d-5257-086b-f4f0-000000000019 7530 1727096023.76417: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096023.76478: no more pending results, returning what we have 7530 1727096023.76482: results queue 
empty 7530 1727096023.76483: checking for any_errors_fatal 7530 1727096023.76489: done checking for any_errors_fatal 7530 1727096023.76490: checking for max_fail_percentage 7530 1727096023.76492: done checking for max_fail_percentage 7530 1727096023.76493: checking to see if all hosts have failed and the running result is not ok 7530 1727096023.76494: done checking to see if all hosts have failed 7530 1727096023.76495: getting the remaining hosts for this loop 7530 1727096023.76496: done getting the remaining hosts for this loop 7530 1727096023.76501: getting the next task for host managed_node3 7530 1727096023.76509: done getting next task for host managed_node3 7530 1727096023.76513: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096023.76516: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096023.76581: getting variables 7530 1727096023.76583: in VariableManager get_vars() 7530 1727096023.76776: Calling all_inventory to load vars for managed_node3 7530 1727096023.76780: Calling groups_inventory to load vars for managed_node3 7530 1727096023.76782: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096023.76794: Calling all_plugins_play to load vars for managed_node3 7530 1727096023.76797: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096023.76830: Calling groups_plugins_play to load vars for managed_node3 7530 1727096023.78192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.79937: done with get_vars() 7530 1727096023.79965: done getting variables 7530 1727096023.80038: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:53:43 -0400 (0:00:00.057) 0:00:14.589 ****** 7530 1727096023.80074: entering _queue_task() for managed_node3/fail 7530 1727096023.80430: worker is 1 (out of 1 available) 7530 1727096023.80678: exiting _queue_task() for managed_node3/fail 7530 1727096023.80688: done queuing things up, now waiting for results queue to drain 7530 1727096023.80690: waiting for pending results... 
7530 1727096023.80932: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096023.80944: in run() - task 0afff68d-5257-086b-f4f0-00000000001a 7530 1727096023.80964: variable 'ansible_search_path' from source: unknown 7530 1727096023.80975: variable 'ansible_search_path' from source: unknown 7530 1727096023.81032: calling self._execute() 7530 1727096023.81135: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.81149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.81185: variable 'omit' from source: magic vars 7530 1727096023.81607: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.81629: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096023.81850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096023.84265: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096023.84359: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096023.84408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096023.84453: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096023.84507: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096023.84581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.84619: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.84773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.84777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.84780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.84832: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.84854: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7530 1727096023.84982: variable 'ansible_distribution' from source: facts 7530 1727096023.84992: variable '__network_rh_distros' from source: role '' defaults 7530 1727096023.85011: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7530 1727096023.85275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.85303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.85339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.85385: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.85405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.85547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.85550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.85552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.85559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.85582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.85630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.85664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 
1727096023.85697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.85743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.85770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.86091: variable 'network_connections' from source: task vars 7530 1727096023.86109: variable 'interface' from source: play vars 7530 1727096023.86189: variable 'interface' from source: play vars 7530 1727096023.86214: variable 'network_state' from source: role '' defaults 7530 1727096023.86285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096023.86467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096023.86503: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096023.86529: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096023.86553: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096023.86587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096023.86603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False)
7530 1727096023.86630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096023.86653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096023.86683: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
7530 1727096023.86687: when evaluation is False, skipping this task
7530 1727096023.86689: _execute() done
7530 1727096023.86691: dumping result to json
7530 1727096023.86694: done dumping result, returning
7530 1727096023.86701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-086b-f4f0-00000000001a]
7530 1727096023.86704: sending task result for task 0afff68d-5257-086b-f4f0-00000000001a
7530 1727096023.86793: done sending task result for task 0afff68d-5257-086b-f4f0-00000000001a
7530 1727096023.86795: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
7530 1727096023.86841: no more pending results, returning what we have
7530 1727096023.86844: results queue empty
7530 1727096023.86845: checking for any_errors_fatal
7530 1727096023.86851: done checking for any_errors_fatal
7530 1727096023.86851: checking for max_fail_percentage
7530 1727096023.86853: done checking for max_fail_percentage
7530 1727096023.86854: checking to see if all hosts have failed and the running result is not ok
7530 1727096023.86855: done checking to see if all hosts have failed
7530 1727096023.86855: getting the remaining hosts for this loop
7530 1727096023.86857: done getting the remaining hosts for this loop
7530 1727096023.86861: getting the next task for host managed_node3
7530 1727096023.86869: done getting next task for host managed_node3
7530 1727096023.86873: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
7530 1727096023.86875: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7530 1727096023.86888: getting variables 7530 1727096023.86889: in VariableManager get_vars() 7530 1727096023.86938: Calling all_inventory to load vars for managed_node3 7530 1727096023.86941: Calling groups_inventory to load vars for managed_node3 7530 1727096023.86943: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096023.86953: Calling all_plugins_play to load vars for managed_node3 7530 1727096023.86955: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096023.86958: Calling groups_plugins_play to load vars for managed_node3 7530 1727096023.87729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.89096: done with get_vars() 7530 1727096023.89119: done getting variables 7530 1727096023.89234: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:53:43 -0400 (0:00:00.091) 0:00:14.681 ****** 7530 1727096023.89277: entering _queue_task() for managed_node3/dnf 7530 1727096023.89549: worker is 1 (out of 1 available) 7530 1727096023.89562: exiting _queue_task() for managed_node3/dnf 7530 1727096023.89580: done queuing things up, now waiting for results queue to drain 7530 1727096023.89582: waiting for pending results... 
7530 1727096023.89750: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7530 1727096023.89841: in run() - task 0afff68d-5257-086b-f4f0-00000000001b 7530 1727096023.89854: variable 'ansible_search_path' from source: unknown 7530 1727096023.89857: variable 'ansible_search_path' from source: unknown 7530 1727096023.89888: calling self._execute() 7530 1727096023.89955: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.89959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.89969: variable 'omit' from source: magic vars 7530 1727096023.90241: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.90252: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096023.90393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096023.92377: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096023.92386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096023.92435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096023.92474: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096023.92497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096023.92662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.92704: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.92723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.92775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.92787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.92880: variable 'ansible_distribution' from source: facts 7530 1727096023.92884: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.92897: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7530 1727096023.93001: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096023.93094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.93111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.93130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.93158: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.93170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.93198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.93213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.93232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.93258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.93270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.93297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096023.93313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096023.93332: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.93355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096023.93365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096023.93472: variable 'network_connections' from source: task vars 7530 1727096023.93486: variable 'interface' from source: play vars 7530 1727096023.93540: variable 'interface' from source: play vars 7530 1727096023.93595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096023.93720: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096023.93750: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096023.93775: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096023.93798: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096023.93860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096023.93864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096023.93889: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096023.93906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096023.93955: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096023.94127: variable 'network_connections' from source: task vars 7530 1727096023.94130: variable 'interface' from source: play vars 7530 1727096023.94177: variable 'interface' from source: play vars 7530 1727096023.94205: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096023.94209: when evaluation is False, skipping this task 7530 1727096023.94212: _execute() done 7530 1727096023.94214: dumping result to json 7530 1727096023.94216: done dumping result, returning 7530 1727096023.94226: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-00000000001b] 7530 1727096023.94228: sending task result for task 0afff68d-5257-086b-f4f0-00000000001b 7530 1727096023.94323: done sending task result for task 0afff68d-5257-086b-f4f0-00000000001b 7530 1727096023.94326: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096023.94433: no more pending results, returning what we have 7530 1727096023.94436: results queue empty 7530 1727096023.94437: checking for any_errors_fatal 7530 1727096023.94443: done checking for any_errors_fatal 7530 1727096023.94444: checking for 
max_fail_percentage 7530 1727096023.94446: done checking for max_fail_percentage 7530 1727096023.94446: checking to see if all hosts have failed and the running result is not ok 7530 1727096023.94447: done checking to see if all hosts have failed 7530 1727096023.94448: getting the remaining hosts for this loop 7530 1727096023.94450: done getting the remaining hosts for this loop 7530 1727096023.94453: getting the next task for host managed_node3 7530 1727096023.94459: done getting next task for host managed_node3 7530 1727096023.94463: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7530 1727096023.94465: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096023.94481: getting variables 7530 1727096023.94483: in VariableManager get_vars() 7530 1727096023.94527: Calling all_inventory to load vars for managed_node3 7530 1727096023.94530: Calling groups_inventory to load vars for managed_node3 7530 1727096023.94532: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096023.94541: Calling all_plugins_play to load vars for managed_node3 7530 1727096023.94543: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096023.94545: Calling groups_plugins_play to load vars for managed_node3 7530 1727096023.95841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096023.97616: done with get_vars() 7530 1727096023.97641: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7530 1727096023.97701: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:53:43 -0400 (0:00:00.084) 0:00:14.765 ****** 7530 1727096023.97727: entering _queue_task() for managed_node3/yum 7530 1727096023.97729: Creating lock for yum 7530 1727096023.97982: worker is 1 (out of 1 available) 7530 1727096023.97995: exiting _queue_task() for managed_node3/yum 7530 1727096023.98007: done queuing things up, now waiting for results queue to drain 7530 1727096023.98009: waiting for pending results... 
7530 1727096023.98187: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7530 1727096023.98285: in run() - task 0afff68d-5257-086b-f4f0-00000000001c 7530 1727096023.98296: variable 'ansible_search_path' from source: unknown 7530 1727096023.98300: variable 'ansible_search_path' from source: unknown 7530 1727096023.98328: calling self._execute() 7530 1727096023.98398: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096023.98402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096023.98410: variable 'omit' from source: magic vars 7530 1727096023.98682: variable 'ansible_distribution_major_version' from source: facts 7530 1727096023.98691: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096023.98814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096024.01948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096024.01992: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096024.02029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096024.02074: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096024.02087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096024.02163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.02193: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.02220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.02273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.02387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.02391: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.02394: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7530 1727096024.02397: when evaluation is False, skipping this task 7530 1727096024.02400: _execute() done 7530 1727096024.02402: dumping result to json 7530 1727096024.02404: done dumping result, returning 7530 1727096024.02406: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-00000000001c] 7530 1727096024.02424: sending task result for task 0afff68d-5257-086b-f4f0-00000000001c 7530 1727096024.02506: done sending task result for task 0afff68d-5257-086b-f4f0-00000000001c 7530 1727096024.02509: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7530 1727096024.02571: no more pending results, returning what we have 7530 
1727096024.02575: results queue empty 7530 1727096024.02576: checking for any_errors_fatal 7530 1727096024.02583: done checking for any_errors_fatal 7530 1727096024.02584: checking for max_fail_percentage 7530 1727096024.02585: done checking for max_fail_percentage 7530 1727096024.02586: checking to see if all hosts have failed and the running result is not ok 7530 1727096024.02587: done checking to see if all hosts have failed 7530 1727096024.02588: getting the remaining hosts for this loop 7530 1727096024.02589: done getting the remaining hosts for this loop 7530 1727096024.02593: getting the next task for host managed_node3 7530 1727096024.02601: done getting next task for host managed_node3 7530 1727096024.02605: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7530 1727096024.02608: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096024.02621: getting variables 7530 1727096024.02701: in VariableManager get_vars() 7530 1727096024.02782: Calling all_inventory to load vars for managed_node3 7530 1727096024.02785: Calling groups_inventory to load vars for managed_node3 7530 1727096024.02787: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096024.02795: Calling all_plugins_play to load vars for managed_node3 7530 1727096024.02797: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096024.02798: Calling groups_plugins_play to load vars for managed_node3 7530 1727096024.07052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096024.08271: done with get_vars() 7530 1727096024.08301: done getting variables 7530 1727096024.08351: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:53:44 -0400 (0:00:00.106) 0:00:14.872 ****** 7530 1727096024.08382: entering _queue_task() for managed_node3/fail 7530 1727096024.08796: worker is 1 (out of 1 available) 7530 1727096024.08808: exiting _queue_task() for managed_node3/fail 7530 1727096024.08819: done queuing things up, now waiting for results queue to drain 7530 1727096024.08821: waiting for pending results... 
7530 1727096024.09015: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7530 1727096024.09128: in run() - task 0afff68d-5257-086b-f4f0-00000000001d 7530 1727096024.09139: variable 'ansible_search_path' from source: unknown 7530 1727096024.09143: variable 'ansible_search_path' from source: unknown 7530 1727096024.09175: calling self._execute() 7530 1727096024.09246: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.09250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.09261: variable 'omit' from source: magic vars 7530 1727096024.09560: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.09572: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096024.09662: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096024.09798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096024.11714: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096024.11832: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096024.11836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096024.11863: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096024.11889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096024.11971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7530 1727096024.11999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.12045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.12061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.12139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.12143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.12146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.12163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.12202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.12216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7530 1727096024.12254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.12277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.12357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.12360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.12363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.12588: variable 'network_connections' from source: task vars 7530 1727096024.12592: variable 'interface' from source: play vars 7530 1727096024.12612: variable 'interface' from source: play vars 7530 1727096024.12694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096024.12839: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096024.12877: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096024.12901: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096024.12927: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7530 1727096024.12957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096024.12973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096024.12991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.13010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096024.13060: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096024.13231: variable 'network_connections' from source: task vars 7530 1727096024.13236: variable 'interface' from source: play vars 7530 1727096024.13287: variable 'interface' from source: play vars 7530 1727096024.13314: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096024.13318: when evaluation is False, skipping this task 7530 1727096024.13323: _execute() done 7530 1727096024.13326: dumping result to json 7530 1727096024.13328: done dumping result, returning 7530 1727096024.13331: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-00000000001d] 7530 1727096024.13336: sending task result for task 0afff68d-5257-086b-f4f0-00000000001d 7530 1727096024.13432: done sending task result for task 0afff68d-5257-086b-f4f0-00000000001d 7530 1727096024.13434: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096024.13511: no more pending results, returning what we have 7530 1727096024.13514: results queue empty 7530 1727096024.13515: checking for any_errors_fatal 7530 1727096024.13524: done checking for any_errors_fatal 7530 1727096024.13525: checking for max_fail_percentage 7530 1727096024.13527: done checking for max_fail_percentage 7530 1727096024.13527: checking to see if all hosts have failed and the running result is not ok 7530 1727096024.13529: done checking to see if all hosts have failed 7530 1727096024.13530: getting the remaining hosts for this loop 7530 1727096024.13531: done getting the remaining hosts for this loop 7530 1727096024.13534: getting the next task for host managed_node3 7530 1727096024.13541: done getting next task for host managed_node3 7530 1727096024.13544: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7530 1727096024.13546: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096024.13561: getting variables 7530 1727096024.13562: in VariableManager get_vars() 7530 1727096024.13608: Calling all_inventory to load vars for managed_node3 7530 1727096024.13611: Calling groups_inventory to load vars for managed_node3 7530 1727096024.13613: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096024.13624: Calling all_plugins_play to load vars for managed_node3 7530 1727096024.13627: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096024.13629: Calling groups_plugins_play to load vars for managed_node3 7530 1727096024.14429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096024.15845: done with get_vars() 7530 1727096024.15870: done getting variables 7530 1727096024.15927: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:53:44 -0400 (0:00:00.075) 0:00:14.947 ****** 7530 1727096024.15951: entering _queue_task() for managed_node3/package 7530 1727096024.16192: worker is 1 (out of 1 available) 7530 1727096024.16206: exiting _queue_task() for managed_node3/package 7530 1727096024.16218: done queuing things up, now waiting for results queue to drain 7530 1727096024.16219: waiting for pending results... 
7530 1727096024.16403: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7530 1727096024.16497: in run() - task 0afff68d-5257-086b-f4f0-00000000001e 7530 1727096024.16508: variable 'ansible_search_path' from source: unknown 7530 1727096024.16511: variable 'ansible_search_path' from source: unknown 7530 1727096024.16544: calling self._execute() 7530 1727096024.16617: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.16621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.16633: variable 'omit' from source: magic vars 7530 1727096024.16921: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.16933: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096024.17074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096024.17266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096024.17301: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096024.17366: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096024.17395: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096024.17480: variable 'network_packages' from source: role '' defaults 7530 1727096024.17555: variable '__network_provider_setup' from source: role '' defaults 7530 1727096024.17564: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096024.17615: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096024.17622: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096024.17672: variable '__network_packages_default_nm' from source: role 
'' defaults 7530 1727096024.17787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096024.19437: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096024.19480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096024.19511: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096024.19537: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096024.19557: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096024.19620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.19642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.19659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.19686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.19697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.19735: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.19750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.19766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.19792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.19803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.19949: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7530 1727096024.20038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.20064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.20083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.20107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.20117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.20187: variable 'ansible_python' from source: facts 7530 1727096024.20208: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7530 1727096024.20272: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096024.20327: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096024.20414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.20433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.20450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.20479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.20490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.20521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.20543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.20558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.20589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.20600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.20698: variable 'network_connections' from source: task vars 7530 1727096024.20704: variable 'interface' from source: play vars 7530 1727096024.20776: variable 'interface' from source: play vars 7530 1727096024.20834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096024.20853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096024.20875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.20898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096024.20936: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096024.21116: variable 'network_connections' from source: task vars 7530 1727096024.21121: variable 'interface' from source: play vars 7530 1727096024.21191: variable 'interface' from source: play vars 7530 1727096024.21239: variable '__network_packages_default_wireless' from source: role '' defaults 7530 1727096024.21291: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096024.21487: variable 'network_connections' from source: task vars 7530 1727096024.21491: variable 'interface' from source: play vars 7530 1727096024.21538: variable 'interface' from source: play vars 7530 1727096024.21564: variable '__network_packages_default_team' from source: role '' defaults 7530 1727096024.21615: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096024.21811: variable 'network_connections' from source: task vars 7530 1727096024.21814: variable 'interface' from source: play vars 7530 1727096024.21862: variable 'interface' from source: play vars 7530 1727096024.21911: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096024.21955: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096024.21960: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096024.22007: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096024.22145: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7530 1727096024.22444: variable 'network_connections' from source: task vars 7530 1727096024.22447: variable 'interface' from source: play vars 7530 1727096024.22492: variable 'interface' from source: play vars 7530 
1727096024.22500: variable 'ansible_distribution' from source: facts 7530 1727096024.22503: variable '__network_rh_distros' from source: role '' defaults 7530 1727096024.22509: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.22530: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7530 1727096024.22637: variable 'ansible_distribution' from source: facts 7530 1727096024.22640: variable '__network_rh_distros' from source: role '' defaults 7530 1727096024.22644: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.22657: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7530 1727096024.22764: variable 'ansible_distribution' from source: facts 7530 1727096024.22769: variable '__network_rh_distros' from source: role '' defaults 7530 1727096024.22774: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.22800: variable 'network_provider' from source: set_fact 7530 1727096024.22811: variable 'ansible_facts' from source: unknown 7530 1727096024.23389: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7530 1727096024.23393: when evaluation is False, skipping this task 7530 1727096024.23395: _execute() done 7530 1727096024.23398: dumping result to json 7530 1727096024.23400: done dumping result, returning 7530 1727096024.23403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-086b-f4f0-00000000001e] 7530 1727096024.23405: sending task result for task 0afff68d-5257-086b-f4f0-00000000001e 7530 1727096024.23700: done sending task result for task 0afff68d-5257-086b-f4f0-00000000001e 7530 1727096024.23703: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7530 1727096024.23745: no more pending results, returning what we have 7530 1727096024.23748: results queue empty 7530 1727096024.23749: checking for any_errors_fatal 7530 1727096024.23754: done checking for any_errors_fatal 7530 1727096024.23754: checking for max_fail_percentage 7530 1727096024.23756: done checking for max_fail_percentage 7530 1727096024.23757: checking to see if all hosts have failed and the running result is not ok 7530 1727096024.23758: done checking to see if all hosts have failed 7530 1727096024.23758: getting the remaining hosts for this loop 7530 1727096024.23760: done getting the remaining hosts for this loop 7530 1727096024.23763: getting the next task for host managed_node3 7530 1727096024.23770: done getting next task for host managed_node3 7530 1727096024.23774: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7530 1727096024.23777: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096024.23790: getting variables 7530 1727096024.23791: in VariableManager get_vars() 7530 1727096024.23850: Calling all_inventory to load vars for managed_node3 7530 1727096024.23853: Calling groups_inventory to load vars for managed_node3 7530 1727096024.23856: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096024.23866: Calling all_plugins_play to load vars for managed_node3 7530 1727096024.23873: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096024.23877: Calling groups_plugins_play to load vars for managed_node3 7530 1727096024.25488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096024.27215: done with get_vars() 7530 1727096024.27248: done getting variables 7530 1727096024.27328: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:53:44 -0400 (0:00:00.114) 0:00:15.061 ****** 7530 1727096024.27364: entering _queue_task() for managed_node3/package 7530 1727096024.27722: worker is 1 (out of 1 available) 7530 1727096024.27736: exiting _queue_task() for managed_node3/package 7530 1727096024.27749: done queuing things up, now waiting for results queue to drain 7530 1727096024.27751: waiting for pending results... 
7530 1727096024.28053: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7530 1727096024.28172: in run() - task 0afff68d-5257-086b-f4f0-00000000001f 7530 1727096024.28190: variable 'ansible_search_path' from source: unknown 7530 1727096024.28197: variable 'ansible_search_path' from source: unknown 7530 1727096024.28221: calling self._execute() 7530 1727096024.28298: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.28308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.28311: variable 'omit' from source: magic vars 7530 1727096024.28621: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.28629: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096024.28718: variable 'network_state' from source: role '' defaults 7530 1727096024.28729: Evaluated conditional (network_state != {}): False 7530 1727096024.28732: when evaluation is False, skipping this task 7530 1727096024.28734: _execute() done 7530 1727096024.28737: dumping result to json 7530 1727096024.28745: done dumping result, returning 7530 1727096024.28750: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-086b-f4f0-00000000001f] 7530 1727096024.28753: sending task result for task 0afff68d-5257-086b-f4f0-00000000001f 7530 1727096024.28851: done sending task result for task 0afff68d-5257-086b-f4f0-00000000001f 7530 1727096024.28854: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096024.28908: no more pending results, returning what we have 7530 1727096024.28912: results queue empty 7530 1727096024.28912: checking for any_errors_fatal 
7530 1727096024.28917: done checking for any_errors_fatal 7530 1727096024.28918: checking for max_fail_percentage 7530 1727096024.28920: done checking for max_fail_percentage 7530 1727096024.28920: checking to see if all hosts have failed and the running result is not ok 7530 1727096024.28924: done checking to see if all hosts have failed 7530 1727096024.28924: getting the remaining hosts for this loop 7530 1727096024.28926: done getting the remaining hosts for this loop 7530 1727096024.28929: getting the next task for host managed_node3 7530 1727096024.28935: done getting next task for host managed_node3 7530 1727096024.28940: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7530 1727096024.28943: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096024.28959: getting variables 7530 1727096024.28960: in VariableManager get_vars() 7530 1727096024.29009: Calling all_inventory to load vars for managed_node3 7530 1727096024.29012: Calling groups_inventory to load vars for managed_node3 7530 1727096024.29014: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096024.29026: Calling all_plugins_play to load vars for managed_node3 7530 1727096024.29028: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096024.29031: Calling groups_plugins_play to load vars for managed_node3 7530 1727096024.29793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096024.30660: done with get_vars() 7530 1727096024.30686: done getting variables 7530 1727096024.30734: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:53:44 -0400 (0:00:00.033) 0:00:15.095 ****** 7530 1727096024.30759: entering _queue_task() for managed_node3/package 7530 1727096024.31012: worker is 1 (out of 1 available) 7530 1727096024.31025: exiting _queue_task() for managed_node3/package 7530 1727096024.31037: done queuing things up, now waiting for results queue to drain 7530 1727096024.31039: waiting for pending results... 
7530 1727096024.31225: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7530 1727096024.31321: in run() - task 0afff68d-5257-086b-f4f0-000000000020 7530 1727096024.31336: variable 'ansible_search_path' from source: unknown 7530 1727096024.31339: variable 'ansible_search_path' from source: unknown 7530 1727096024.31370: calling self._execute() 7530 1727096024.31447: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.31451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.31460: variable 'omit' from source: magic vars 7530 1727096024.31755: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.31771: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096024.31878: variable 'network_state' from source: role '' defaults 7530 1727096024.31887: Evaluated conditional (network_state != {}): False 7530 1727096024.31891: when evaluation is False, skipping this task 7530 1727096024.31893: _execute() done 7530 1727096024.31896: dumping result to json 7530 1727096024.31898: done dumping result, returning 7530 1727096024.31906: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-086b-f4f0-000000000020] 7530 1727096024.31910: sending task result for task 0afff68d-5257-086b-f4f0-000000000020 7530 1727096024.32008: done sending task result for task 0afff68d-5257-086b-f4f0-000000000020 7530 1727096024.32011: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096024.32076: no more pending results, returning what we have 7530 1727096024.32079: results queue empty 7530 1727096024.32080: checking for any_errors_fatal 7530 
1727096024.32089: done checking for any_errors_fatal 7530 1727096024.32089: checking for max_fail_percentage 7530 1727096024.32091: done checking for max_fail_percentage 7530 1727096024.32092: checking to see if all hosts have failed and the running result is not ok 7530 1727096024.32093: done checking to see if all hosts have failed 7530 1727096024.32094: getting the remaining hosts for this loop 7530 1727096024.32095: done getting the remaining hosts for this loop 7530 1727096024.32098: getting the next task for host managed_node3 7530 1727096024.32104: done getting next task for host managed_node3 7530 1727096024.32109: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096024.32112: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096024.32129: getting variables 7530 1727096024.32131: in VariableManager get_vars() 7530 1727096024.32176: Calling all_inventory to load vars for managed_node3 7530 1727096024.32179: Calling groups_inventory to load vars for managed_node3 7530 1727096024.32181: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096024.32190: Calling all_plugins_play to load vars for managed_node3 7530 1727096024.32192: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096024.32195: Calling groups_plugins_play to load vars for managed_node3 7530 1727096024.33065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096024.34462: done with get_vars() 7530 1727096024.34496: done getting variables 7530 1727096024.34577: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:53:44 -0400 (0:00:00.038) 0:00:15.134 ****** 7530 1727096024.34603: entering _queue_task() for managed_node3/service 7530 1727096024.34605: Creating lock for service 7530 1727096024.34860: worker is 1 (out of 1 available) 7530 1727096024.34875: exiting _queue_task() for managed_node3/service 7530 1727096024.34888: done queuing things up, now waiting for results queue to drain 7530 1727096024.34890: waiting for pending results... 
7530 1727096024.35072: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096024.35176: in run() - task 0afff68d-5257-086b-f4f0-000000000021 7530 1727096024.35189: variable 'ansible_search_path' from source: unknown 7530 1727096024.35193: variable 'ansible_search_path' from source: unknown 7530 1727096024.35226: calling self._execute() 7530 1727096024.35297: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.35304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.35313: variable 'omit' from source: magic vars 7530 1727096024.35598: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.35608: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096024.35695: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096024.35829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096024.37981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096024.38038: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096024.38065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096024.38092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096024.38115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096024.38173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 
1727096024.38193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.38215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.38240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.38252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.38287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.38304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.38325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.38349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.38359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7530 1727096024.38389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.38405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.38425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.38450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.38460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.38583: variable 'network_connections' from source: task vars 7530 1727096024.38594: variable 'interface' from source: play vars 7530 1727096024.38653: variable 'interface' from source: play vars 7530 1727096024.38707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096024.38827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096024.38862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096024.38888: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096024.38911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096024.38943: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096024.38958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096024.38984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.38999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096024.39047: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096024.39239: variable 'network_connections' from source: task vars 7530 1727096024.39243: variable 'interface' from source: play vars 7530 1727096024.39336: variable 'interface' from source: play vars 7530 1727096024.39339: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096024.39344: when evaluation is False, skipping this task 7530 1727096024.39346: _execute() done 7530 1727096024.39349: dumping result to json 7530 1727096024.39351: done dumping result, returning 7530 1727096024.39354: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000021] 7530 1727096024.39356: sending task result for task 0afff68d-5257-086b-f4f0-000000000021 7530 1727096024.39440: done sending task result for task 0afff68d-5257-086b-f4f0-000000000021 7530 1727096024.39449: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096024.39517: no more pending results, returning what we have 7530 1727096024.39521: results queue empty 7530 1727096024.39524: checking for any_errors_fatal 7530 1727096024.39531: done checking for any_errors_fatal 7530 1727096024.39531: checking for max_fail_percentage 7530 1727096024.39533: done checking for max_fail_percentage 7530 1727096024.39534: checking to see if all hosts have failed and the running result is not ok 7530 1727096024.39535: done checking to see if all hosts have failed 7530 1727096024.39535: getting the remaining hosts for this loop 7530 1727096024.39537: done getting the remaining hosts for this loop 7530 1727096024.39540: getting the next task for host managed_node3 7530 1727096024.39546: done getting next task for host managed_node3 7530 1727096024.39549: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096024.39552: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096024.39565: getting variables 7530 1727096024.39569: in VariableManager get_vars() 7530 1727096024.39613: Calling all_inventory to load vars for managed_node3 7530 1727096024.39616: Calling groups_inventory to load vars for managed_node3 7530 1727096024.39618: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096024.39629: Calling all_plugins_play to load vars for managed_node3 7530 1727096024.39632: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096024.39634: Calling groups_plugins_play to load vars for managed_node3 7530 1727096024.40916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096024.41916: done with get_vars() 7530 1727096024.41935: done getting variables 7530 1727096024.41984: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:53:44 -0400 (0:00:00.074) 0:00:15.208 ****** 7530 1727096024.42008: entering _queue_task() for managed_node3/service 7530 1727096024.42250: worker is 1 (out of 1 available) 7530 1727096024.42263: exiting _queue_task() for managed_node3/service 7530 1727096024.42276: done queuing things up, now waiting for results queue to drain 7530 1727096024.42278: waiting for pending results... 
7530 1727096024.42464: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096024.42557: in run() - task 0afff68d-5257-086b-f4f0-000000000022 7530 1727096024.42570: variable 'ansible_search_path' from source: unknown 7530 1727096024.42574: variable 'ansible_search_path' from source: unknown 7530 1727096024.42612: calling self._execute() 7530 1727096024.42722: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.42729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.42732: variable 'omit' from source: magic vars 7530 1727096024.43174: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.43189: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096024.43319: variable 'network_provider' from source: set_fact 7530 1727096024.43323: variable 'network_state' from source: role '' defaults 7530 1727096024.43393: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7530 1727096024.43412: variable 'omit' from source: magic vars 7530 1727096024.43415: variable 'omit' from source: magic vars 7530 1727096024.43431: variable 'network_service_name' from source: role '' defaults 7530 1727096024.43507: variable 'network_service_name' from source: role '' defaults 7530 1727096024.43636: variable '__network_provider_setup' from source: role '' defaults 7530 1727096024.43639: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096024.43756: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096024.43759: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096024.43761: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096024.43961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 
1727096024.45692: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096024.45749: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096024.45779: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096024.45806: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096024.45829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096024.45891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.45911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.45934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.45960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.45973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.46006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7530 1727096024.46021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.46044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.46070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.46081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.46234: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7530 1727096024.46317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.46337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.46359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.46384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.46395: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.46462: variable 'ansible_python' from source: facts 7530 1727096024.46482: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7530 1727096024.46541: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096024.46772: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096024.46776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.46779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.46787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.46830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.46851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.46903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096024.46945: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096024.46976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.47025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096024.47045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096024.47182: variable 'network_connections' from source: task vars 7530 1727096024.47194: variable 'interface' from source: play vars 7530 1727096024.47276: variable 'interface' from source: play vars 7530 1727096024.47409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096024.47619: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096024.47681: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096024.47722: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096024.47753: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096024.47802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096024.47823: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096024.47847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096024.47877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096024.47915: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096024.48102: variable 'network_connections' from source: task vars 7530 1727096024.48109: variable 'interface' from source: play vars 7530 1727096024.48164: variable 'interface' from source: play vars 7530 1727096024.48210: variable '__network_packages_default_wireless' from source: role '' defaults 7530 1727096024.48270: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096024.48458: variable 'network_connections' from source: task vars 7530 1727096024.48461: variable 'interface' from source: play vars 7530 1727096024.48512: variable 'interface' from source: play vars 7530 1727096024.48538: variable '__network_packages_default_team' from source: role '' defaults 7530 1727096024.48593: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096024.48781: variable 'network_connections' from source: task vars 7530 1727096024.48785: variable 'interface' from source: play vars 7530 1727096024.48835: variable 'interface' from source: play vars 7530 1727096024.48886: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096024.48930: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096024.48936: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7530 1727096024.48982: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096024.49118: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7530 1727096024.49452: variable 'network_connections' from source: task vars 7530 1727096024.49455: variable 'interface' from source: play vars 7530 1727096024.49501: variable 'interface' from source: play vars 7530 1727096024.49511: variable 'ansible_distribution' from source: facts 7530 1727096024.49519: variable '__network_rh_distros' from source: role '' defaults 7530 1727096024.49526: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.49544: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7530 1727096024.49662: variable 'ansible_distribution' from source: facts 7530 1727096024.49665: variable '__network_rh_distros' from source: role '' defaults 7530 1727096024.49673: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.49684: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7530 1727096024.49798: variable 'ansible_distribution' from source: facts 7530 1727096024.49802: variable '__network_rh_distros' from source: role '' defaults 7530 1727096024.49807: variable 'ansible_distribution_major_version' from source: facts 7530 1727096024.49841: variable 'network_provider' from source: set_fact 7530 1727096024.49856: variable 'omit' from source: magic vars 7530 1727096024.49881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096024.49902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096024.49917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096024.49931: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096024.49939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096024.49965: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096024.49970: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.49972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.50038: Set connection var ansible_pipelining to False 7530 1727096024.50044: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096024.50050: Set connection var ansible_timeout to 10 7530 1727096024.50060: Set connection var ansible_shell_executable to /bin/sh 7530 1727096024.50062: Set connection var ansible_shell_type to sh 7530 1727096024.50065: Set connection var ansible_connection to ssh 7530 1727096024.50086: variable 'ansible_shell_executable' from source: unknown 7530 1727096024.50089: variable 'ansible_connection' from source: unknown 7530 1727096024.50091: variable 'ansible_module_compression' from source: unknown 7530 1727096024.50093: variable 'ansible_shell_type' from source: unknown 7530 1727096024.50095: variable 'ansible_shell_executable' from source: unknown 7530 1727096024.50097: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096024.50101: variable 'ansible_pipelining' from source: unknown 7530 1727096024.50103: variable 'ansible_timeout' from source: unknown 7530 1727096024.50108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096024.50185: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096024.50195: variable 'omit' from source: magic vars 7530 1727096024.50204: starting attempt loop 7530 1727096024.50206: running the handler 7530 1727096024.50265: variable 'ansible_facts' from source: unknown 7530 1727096024.50734: _low_level_execute_command(): starting 7530 1727096024.50740: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096024.51247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096024.51254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096024.51259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096024.51311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096024.51314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096024.51316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096024.51362: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096024.53115: stdout chunk (state=3): >>>/root <<< 7530 1727096024.53211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096024.53239: stderr chunk (state=3): >>><<< 7530 1727096024.53242: stdout chunk (state=3): >>><<< 7530 1727096024.53261: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096024.53274: _low_level_execute_command(): starting 7530 1727096024.53280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054 `" && echo ansible-tmp-1727096024.5326197-8159-152911448648054="` echo 
/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054 `" ) && sleep 0' 7530 1727096024.53748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096024.53753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096024.53755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096024.53757: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096024.53760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096024.53812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096024.53816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096024.53818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096024.53860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096024.55892: stdout chunk (state=3): >>>ansible-tmp-1727096024.5326197-8159-152911448648054=/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054 <<< 7530 1727096024.55991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096024.56024: 
stderr chunk (state=3): >>><<< 7530 1727096024.56027: stdout chunk (state=3): >>><<< 7530 1727096024.56043: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096024.5326197-8159-152911448648054=/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096024.56072: variable 'ansible_module_compression' from source: unknown 7530 1727096024.56118: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 7530 1727096024.56125: ANSIBALLZ: Acquiring lock 7530 1727096024.56127: ANSIBALLZ: Lock acquired: 139837168144544 7530 1727096024.56129: ANSIBALLZ: Creating module 7530 1727096024.77054: ANSIBALLZ: Writing module into payload 7530 1727096024.77164: ANSIBALLZ: Writing module 7530 1727096024.77191: ANSIBALLZ: Renaming module 7530 1727096024.77197: ANSIBALLZ: Done creating 
module 7530 1727096024.77232: variable 'ansible_facts' from source: unknown 7530 1727096024.77372: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py 7530 1727096024.77509: Sending initial data 7530 1727096024.77513: Sent initial data (154 bytes) 7530 1727096024.78200: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096024.78208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096024.78270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096024.78296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096024.79976: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096024.80002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096024.80034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp7fguoml3 /root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py <<< 7530 1727096024.80038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py" <<< 7530 1727096024.80069: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp7fguoml3" to remote "/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py" <<< 7530 1727096024.80072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py" <<< 7530 1727096024.81442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096024.81446: stderr chunk (state=3): >>><<< 7530 1727096024.81448: stdout chunk (state=3): >>><<< 7530 1727096024.81492: done transferring module to remote 7530 1727096024.81503: _low_level_execute_command(): starting 7530 1727096024.81599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/ 
/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py && sleep 0' 7530 1727096024.82222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096024.82265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096024.82272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096024.82330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096024.84283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096024.84303: stderr chunk (state=3): >>><<< 7530 1727096024.84318: stdout chunk (state=3): >>><<< 7530 1727096024.84338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096024.84426: _low_level_execute_command(): starting 7530 1727096024.84431: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/AnsiballZ_systemd.py && sleep 0' 7530 1727096024.85008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096024.85045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096024.85071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096024.85099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096024.85102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096024.85127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration <<< 7530 1727096024.85153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096024.85197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096024.85211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096024.85261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096025.15613: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", 
"ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9363456", "MemoryPeak": "9883648", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3361898496", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "84676000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not 
set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "M<<< 7530 1727096025.15647: stdout chunk (state=3): >>>emoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target Netwo<<< 7530 1727096025.15652: stdout chunk (state=3): >>>rkManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", 
"InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7530 1727096025.17677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096025.17703: stderr chunk (state=3): >>><<< 7530 1727096025.17706: stdout chunk (state=3): >>><<< 7530 1727096025.17724: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9363456", "MemoryPeak": "9883648", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3361898496", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "84676000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": 
"Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096025.17847: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096025.17864: _low_level_execute_command(): starting 7530 1727096025.17871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096024.5326197-8159-152911448648054/ > /dev/null 2>&1 && sleep 0' 7530 1727096025.18338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096025.18341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.18344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096025.18346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096025.18348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.18398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096025.18402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096025.18404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096025.18448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096025.20286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096025.20307: stderr chunk (state=3): >>><<< 7530 1727096025.20310: stdout chunk (state=3): >>><<< 7530 1727096025.20323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096025.20333: handler run complete
7530 1727096025.20380: attempt loop complete, returning result
7530 1727096025.20383: _execute() done
7530 1727096025.20386: dumping result to json
7530 1727096025.20397: done dumping result, returning
7530 1727096025.20405: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-086b-f4f0-000000000022]
7530 1727096025.20409: sending task result for task 0afff68d-5257-086b-f4f0-000000000022
7530 1727096025.20647: done sending task result for task 0afff68d-5257-086b-f4f0-000000000022
7530 1727096025.20649: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7530 1727096025.20705: no more pending results, returning what we have
7530 1727096025.20708: results queue empty
7530 1727096025.20709: checking for any_errors_fatal
7530 1727096025.20718: done checking for any_errors_fatal
7530 1727096025.20719: checking for max_fail_percentage
7530 1727096025.20720: done checking for max_fail_percentage
7530 1727096025.20721: checking to see if all hosts have failed and the running result is not ok
7530 1727096025.20724: done checking to see if all hosts have failed
7530 1727096025.20725: getting the remaining hosts for this loop
7530 1727096025.20727: done getting the remaining hosts for this loop
7530 1727096025.20730: getting the next task for host managed_node3
7530 1727096025.20735: done getting next task for host managed_node3
7530 1727096025.20739: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
7530 1727096025.20741: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096025.20751: getting variables
7530 1727096025.20752: in VariableManager get_vars()
7530 1727096025.20797: Calling all_inventory to load vars for managed_node3
7530 1727096025.20800: Calling groups_inventory to load vars for managed_node3
7530 1727096025.20802: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096025.20812: Calling all_plugins_play to load vars for managed_node3
7530 1727096025.20815: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096025.20817: Calling groups_plugins_play to load vars for managed_node3
7530 1727096025.21591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096025.22450: done with get_vars()
7530 1727096025.22471: done getting variables
7530 1727096025.22516: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Monday 23 September 2024 08:53:45 -0400 (0:00:00.805) 0:00:16.013 ******
7530 1727096025.22543: entering _queue_task() for managed_node3/service
7530 1727096025.22789: worker is 1 (out of 1
available) 7530 1727096025.22802: exiting _queue_task() for managed_node3/service 7530 1727096025.22814: done queuing things up, now waiting for results queue to drain 7530 1727096025.22816: waiting for pending results... 7530 1727096025.22999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096025.23098: in run() - task 0afff68d-5257-086b-f4f0-000000000023 7530 1727096025.23109: variable 'ansible_search_path' from source: unknown 7530 1727096025.23113: variable 'ansible_search_path' from source: unknown 7530 1727096025.23142: calling self._execute() 7530 1727096025.23212: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096025.23217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096025.23227: variable 'omit' from source: magic vars 7530 1727096025.23502: variable 'ansible_distribution_major_version' from source: facts 7530 1727096025.23512: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096025.23596: variable 'network_provider' from source: set_fact 7530 1727096025.23600: Evaluated conditional (network_provider == "nm"): True 7530 1727096025.23664: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096025.23726: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096025.23849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096025.25491: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096025.25537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096025.25565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096025.25591: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096025.25611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096025.25673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096025.25694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096025.25711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096025.25738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096025.25749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096025.25787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096025.25803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096025.25820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 7530 1727096025.25845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096025.25855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096025.25887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096025.25903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096025.25919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096025.25944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096025.25953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096025.26052: variable 'network_connections' from source: task vars 7530 1727096025.26064: variable 'interface' from source: play vars 7530 1727096025.26125: variable 'interface' from source: play vars 7530 1727096025.26178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096025.26304: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096025.26333: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096025.26355: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096025.26378: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096025.26408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096025.26428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096025.26444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096025.26461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096025.26500: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096025.26661: variable 'network_connections' from source: task vars 7530 1727096025.26665: variable 'interface' from source: play vars 7530 1727096025.26711: variable 'interface' from source: play vars 7530 1727096025.26746: Evaluated conditional (__network_wpa_supplicant_required): False 7530 1727096025.26749: when evaluation is False, skipping this task 7530 1727096025.26751: _execute() done 7530 1727096025.26754: dumping result to json 7530 1727096025.26756: done dumping result, returning 7530 1727096025.26766: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-086b-f4f0-000000000023]
7530 1727096025.26780: sending task result for task 0afff68d-5257-086b-f4f0-000000000023
7530 1727096025.26855: done sending task result for task 0afff68d-5257-086b-f4f0-000000000023
7530 1727096025.26857: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
7530 1727096025.26913: no more pending results, returning what we have
7530 1727096025.26917: results queue empty
7530 1727096025.26918: checking for any_errors_fatal
7530 1727096025.26940: done checking for any_errors_fatal
7530 1727096025.26940: checking for max_fail_percentage
7530 1727096025.26942: done checking for max_fail_percentage
7530 1727096025.26943: checking to see if all hosts have failed and the running result is not ok
7530 1727096025.26943: done checking to see if all hosts have failed
7530 1727096025.26944: getting the remaining hosts for this loop
7530 1727096025.26945: done getting the remaining hosts for this loop
7530 1727096025.26949: getting the next task for host managed_node3
7530 1727096025.26956: done getting next task for host managed_node3
7530 1727096025.26960: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
7530 1727096025.26962: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096025.26979: getting variables
7530 1727096025.26980: in VariableManager get_vars()
7530 1727096025.27027: Calling all_inventory to load vars for managed_node3
7530 1727096025.27030: Calling groups_inventory to load vars for managed_node3
7530 1727096025.27032: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096025.27040: Calling all_plugins_play to load vars for managed_node3
7530 1727096025.27042: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096025.27045: Calling groups_plugins_play to load vars for managed_node3
7530 1727096025.27929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096025.29389: done with get_vars()
7530 1727096025.29427: done getting variables
7530 1727096025.29497: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Monday 23 September 2024 08:53:45 -0400 (0:00:00.069) 0:00:16.083 ******
7530 1727096025.29536: entering _queue_task() for managed_node3/service
7530 1727096025.29880: worker is 1 (out of 1 available)
7530 1727096025.29895: exiting _queue_task() for managed_node3/service
7530 1727096025.29906: done queuing things up, now waiting for results queue to drain
7530 1727096025.29908: waiting for pending results...
7530 1727096025.30292: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service
7530 1727096025.30385: in run() - task 0afff68d-5257-086b-f4f0-000000000024
7530 1727096025.30412: variable 'ansible_search_path' from source: unknown
7530 1727096025.30425: variable 'ansible_search_path' from source: unknown
7530 1727096025.30472: calling self._execute()
7530 1727096025.30577: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096025.30611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096025.30615: variable 'omit' from source: magic vars
7530 1727096025.31049: variable 'ansible_distribution_major_version' from source: facts
7530 1727096025.31059: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096025.31238: variable 'network_provider' from source: set_fact
7530 1727096025.31271: Evaluated conditional (network_provider == "initscripts"): False
7530 1727096025.31275: when evaluation is False, skipping this task
7530 1727096025.31278: _execute() done
7530 1727096025.31280: dumping result to json
7530 1727096025.31283: done dumping result, returning
7530 1727096025.31376: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-086b-f4f0-000000000024]
7530 1727096025.31379: sending task result for task 0afff68d-5257-086b-f4f0-000000000024
7530 1727096025.31454: done sending task result for task 0afff68d-5257-086b-f4f0-000000000024
7530 1727096025.31458: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7530 1727096025.31528: no more pending results, returning what we have
7530 1727096025.31532: results queue empty
7530 1727096025.31534: checking for any_errors_fatal
7530 1727096025.31544: done checking for any_errors_fatal
7530 1727096025.31544: checking for max_fail_percentage
7530 1727096025.31547: done checking for max_fail_percentage
7530 1727096025.31548: checking to see if all hosts have failed and the running result is not ok
7530 1727096025.31549: done checking to see if all hosts have failed
7530 1727096025.31550: getting the remaining hosts for this loop
7530 1727096025.31551: done getting the remaining hosts for this loop
7530 1727096025.31555: getting the next task for host managed_node3
7530 1727096025.31562: done getting next task for host managed_node3
7530 1727096025.31566: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
7530 1727096025.31572: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096025.31589: getting variables
7530 1727096025.31591: in VariableManager get_vars()
7530 1727096025.31648: Calling all_inventory to load vars for managed_node3
7530 1727096025.31651: Calling groups_inventory to load vars for managed_node3
7530 1727096025.31654: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096025.31769: Calling all_plugins_play to load vars for managed_node3
7530 1727096025.31775: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096025.31779: Calling groups_plugins_play to load vars for managed_node3
7530 1727096025.33811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096025.34828: done with get_vars()
7530 1727096025.34846: done getting variables
7530 1727096025.34892: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Monday 23 September 2024 08:53:45 -0400 (0:00:00.053) 0:00:16.137 ******
7530 1727096025.34918: entering _queue_task() for managed_node3/copy
7530 1727096025.35163: worker is 1 (out of 1 available)
7530 1727096025.35177: exiting _queue_task() for managed_node3/copy
7530 1727096025.35190: done queuing things up, now waiting for results queue to drain
7530 1727096025.35192: waiting for pending results...
7530 1727096025.35377: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
7530 1727096025.35465: in run() - task 0afff68d-5257-086b-f4f0-000000000025
7530 1727096025.35479: variable 'ansible_search_path' from source: unknown
7530 1727096025.35483: variable 'ansible_search_path' from source: unknown
7530 1727096025.35511: calling self._execute()
7530 1727096025.35590: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096025.35595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096025.35602: variable 'omit' from source: magic vars
7530 1727096025.36073: variable 'ansible_distribution_major_version' from source: facts
7530 1727096025.36077: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096025.36101: variable 'network_provider' from source: set_fact
7530 1727096025.36110: Evaluated conditional (network_provider == "initscripts"): False
7530 1727096025.36118: when evaluation is False, skipping this task
7530 1727096025.36129: _execute() done
7530 1727096025.36136: dumping result to json
7530 1727096025.36143: done dumping result, returning
7530 1727096025.36155: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-086b-f4f0-000000000025]
7530 1727096025.36163: sending task result for task 0afff68d-5257-086b-f4f0-000000000025
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
7530 1727096025.36416: no more pending results, returning what we have
7530 1727096025.36420: results queue empty
7530 1727096025.36421: checking for any_errors_fatal
7530 1727096025.36426: done checking for any_errors_fatal
7530 1727096025.36427: checking for max_fail_percentage
7530 1727096025.36429: done checking for max_fail_percentage
7530 1727096025.36429: checking to see if all hosts have failed and the running result is not ok
7530 1727096025.36430: done checking to see if all hosts have failed
7530 1727096025.36431: getting the remaining hosts for this loop
7530 1727096025.36432: done getting the remaining hosts for this loop
7530 1727096025.36436: getting the next task for host managed_node3
7530 1727096025.36442: done getting next task for host managed_node3
7530 1727096025.36446: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
7530 1727096025.36448: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096025.36466: getting variables
7530 1727096025.36485: in VariableManager get_vars()
7530 1727096025.36527: Calling all_inventory to load vars for managed_node3
7530 1727096025.36529: Calling groups_inventory to load vars for managed_node3
7530 1727096025.36532: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096025.36542: Calling all_plugins_play to load vars for managed_node3
7530 1727096025.36545: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096025.36548: Calling groups_plugins_play to load vars for managed_node3
7530 1727096025.37070: done sending task result for task 0afff68d-5257-086b-f4f0-000000000025
7530 1727096025.37073: WORKER PROCESS EXITING
7530 1727096025.37878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096025.39238: done with get_vars()
7530 1727096025.39263: done getting variables
TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Monday 23 September 2024 08:53:45 -0400 (0:00:00.044) 0:00:16.181 ******
7530 1727096025.39334: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
7530 1727096025.39335: Creating lock for fedora.linux_system_roles.network_connections
7530 1727096025.39593: worker is 1 (out of 1 available)
7530 1727096025.39607: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
7530 1727096025.39618: done queuing things up, now waiting for results queue to drain
7530 1727096025.39620: waiting for pending results...
7530 1727096025.39807: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7530 1727096025.39915: in run() - task 0afff68d-5257-086b-f4f0-000000000026 7530 1727096025.39929: variable 'ansible_search_path' from source: unknown 7530 1727096025.39933: variable 'ansible_search_path' from source: unknown 7530 1727096025.39963: calling self._execute() 7530 1727096025.40038: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096025.40042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096025.40050: variable 'omit' from source: magic vars 7530 1727096025.40341: variable 'ansible_distribution_major_version' from source: facts 7530 1727096025.40351: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096025.40357: variable 'omit' from source: magic vars 7530 1727096025.40401: variable 'omit' from source: magic vars 7530 1727096025.40517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096025.42386: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096025.42432: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096025.42460: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096025.42489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096025.42511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096025.42572: variable 'network_provider' from source: set_fact 7530 1727096025.42674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096025.42708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096025.42727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096025.42753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096025.42764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096025.42818: variable 'omit' from source: magic vars 7530 1727096025.42899: variable 'omit' from source: magic vars 7530 1727096025.42972: variable 'network_connections' from source: task vars 7530 1727096025.42983: variable 'interface' from source: play vars 7530 1727096025.43034: variable 'interface' from source: play vars 7530 1727096025.43151: variable 'omit' from source: magic vars 7530 1727096025.43158: variable '__lsr_ansible_managed' from source: task vars 7530 1727096025.43201: variable '__lsr_ansible_managed' from source: task vars 7530 1727096025.43633: Loaded config def from plugin (lookup/template) 7530 1727096025.43637: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7530 1727096025.43658: File lookup term: get_ansible_managed.j2 7530 1727096025.43661: variable 'ansible_search_path' from source: unknown 7530 1727096025.43666: evaluation_path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7530 1727096025.43679: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7530 1727096025.43695: variable 'ansible_search_path' from source: unknown 7530 1727096025.48174: variable 'ansible_managed' from source: unknown 7530 1727096025.48179: variable 'omit' from source: magic vars 7530 1727096025.48205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096025.48241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096025.48264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096025.48291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096025.48305: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096025.48340: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096025.48349: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096025.48357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096025.48452: Set connection var ansible_pipelining to False 7530 1727096025.48463: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096025.48475: Set connection var ansible_timeout to 10 7530 1727096025.48488: Set connection var ansible_shell_executable to /bin/sh 7530 1727096025.48494: Set connection var ansible_shell_type to sh 7530 1727096025.48499: Set connection var ansible_connection to ssh 7530 1727096025.48529: variable 'ansible_shell_executable' from source: unknown 7530 1727096025.48537: variable 'ansible_connection' from source: unknown 7530 1727096025.48544: variable 'ansible_module_compression' from source: unknown 7530 1727096025.48550: variable 'ansible_shell_type' from source: unknown 7530 1727096025.48556: variable 'ansible_shell_executable' from source: unknown 7530 1727096025.48562: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096025.48572: variable 'ansible_pipelining' from source: unknown 7530 1727096025.48578: variable 'ansible_timeout' from source: unknown 7530 1727096025.48585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096025.48728: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096025.48755: variable 'omit' from source: magic vars 7530 1727096025.48765: starting attempt loop 7530 1727096025.48775: running the handler 7530 
1727096025.48793: _low_level_execute_command(): starting 7530 1727096025.48806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096025.49351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096025.49356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096025.49358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.49415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096025.49418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096025.49421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096025.49464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096025.51161: stdout chunk (state=3): >>>/root <<< 7530 1727096025.51317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096025.51321: stdout chunk (state=3): >>><<< 7530 1727096025.51323: stderr chunk (state=3): >>><<< 7530 1727096025.51345: _low_level_execute_command() done: rc=0, stdout=/root 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096025.51375: _low_level_execute_command(): starting 7530 1727096025.51486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510 `" && echo ansible-tmp-1727096025.5135932-8195-191112001063510="` echo /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510 `" ) && sleep 0' 7530 1727096025.51893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096025.51908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096025.51919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.51966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096025.51984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096025.52024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096025.54029: stdout chunk (state=3): >>>ansible-tmp-1727096025.5135932-8195-191112001063510=/root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510 <<< 7530 1727096025.54155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096025.54159: stdout chunk (state=3): >>><<< 7530 1727096025.54164: stderr chunk (state=3): >>><<< 7530 1727096025.54186: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096025.5135932-8195-191112001063510=/root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096025.54228: variable 'ansible_module_compression' from source: unknown 7530 1727096025.54269: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 7530 1727096025.54273: ANSIBALLZ: Acquiring lock 7530 1727096025.54276: ANSIBALLZ: Lock acquired: 139837168252928 7530 1727096025.54278: ANSIBALLZ: Creating module 7530 1727096025.68265: ANSIBALLZ: Writing module into payload 7530 1727096025.68489: ANSIBALLZ: Writing module 7530 1727096025.68508: ANSIBALLZ: Renaming module 7530 1727096025.68514: ANSIBALLZ: Done creating module 7530 1727096025.68535: variable 'ansible_facts' from source: unknown 7530 1727096025.68602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py 7530 1727096025.68704: Sending initial data 7530 1727096025.68707: Sent initial data (166 bytes) 7530 1727096025.69350: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096025.69365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096025.69388: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7530 1727096025.69406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096025.69458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.69529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096025.69565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096025.69629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096025.71271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096025.71296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096025.71326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5hqzwynq /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py <<< 7530 1727096025.71332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py" <<< 7530 1727096025.71357: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5hqzwynq" to remote "/root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py" <<< 7530 1727096025.71363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py" <<< 7530 1727096025.72375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096025.72379: stdout chunk (state=3): >>><<< 7530 1727096025.72381: stderr chunk (state=3): >>><<< 7530 1727096025.72396: done transferring module to remote 7530 1727096025.72478: _low_level_execute_command(): starting 7530 1727096025.72482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/ /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py && sleep 0' 7530 1727096025.72945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096025.72965: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.73011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096025.73027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096025.73069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096025.74955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096025.74959: stdout chunk (state=3): >>><<< 7530 1727096025.74962: stderr chunk (state=3): >>><<< 7530 1727096025.74983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096025.75066: _low_level_execute_command(): starting 7530 1727096025.75277: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/AnsiballZ_network_connections.py && sleep 0' 7530 1727096025.75619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096025.75638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096025.75655: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096025.75701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096025.75720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096025.75766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096026.46576: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7530 1727096026.48693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096026.48697: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096026.48699: stdout chunk (state=3): >>><<< 7530 1727096026.48701: stderr chunk (state=3): >>><<< 7530 1727096026.48703: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096026.48715: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': True, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1', 'route_metric4': 65535}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096026.48731: _low_level_execute_command(): starting 7530 1727096026.48741: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096025.5135932-8195-191112001063510/ > /dev/null 2>&1 && sleep 0' 7530 
1727096026.49482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096026.49496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096026.49546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096026.49559: stderr chunk (state=3): >>>debug2: match found <<< 7530 1727096026.49576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096026.49657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096026.49676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096026.49699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096026.49796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096026.51702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096026.51764: stderr chunk (state=3): >>><<< 7530 1727096026.51892: stdout chunk (state=3): >>><<< 7530 1727096026.51974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096026.51982: handler run complete 7530 1727096026.51985: attempt loop complete, returning result 7530 1727096026.51987: _execute() done 7530 1727096026.51989: dumping result to json 7530 1727096026.51992: done dumping result, returning 7530 1727096026.52273: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-086b-f4f0-000000000026] 7530 1727096026.52276: sending task result for task 0afff68d-5257-086b-f4f0-000000000026 7530 1727096026.52356: done sending task result for task 0afff68d-5257-086b-f4f0-000000000026 7530 1727096026.52360: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "ip": {
                        "address": [
                            "2001:db8::2/64",
                            "203.0.113.2/24"
                        ],
                        "auto6": false,
                        "auto_gateway": true,
                        "dhcp4": false,
                        "gateway4": "203.0.113.1",
                        "gateway6": "2001:db8::1",
                        "route_metric4": 65535
                    },
                    "name": "veth0",
                    "state": "up",
                    "type": "ethernet"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3
[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3 (not-active)

7530 1727096026.52466: no more pending results, returning what we have 7530 1727096026.52471: results queue empty 7530 1727096026.52472: checking for any_errors_fatal 7530 1727096026.52478: done checking for any_errors_fatal 7530 1727096026.52479: checking for max_fail_percentage 7530 1727096026.52481: done checking for max_fail_percentage 7530 1727096026.52481: checking to see if all hosts have failed and the running result is not ok 7530 1727096026.52482: done checking to see if all hosts have failed 7530 1727096026.52483: getting the remaining hosts for this loop 7530 1727096026.52484: done getting the remaining hosts for this loop 7530 1727096026.52488: getting the next task for host managed_node3 7530 1727096026.52494: done getting next task for host managed_node3 7530 1727096026.52498: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096026.52500: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7530 1727096026.52509: getting variables 7530 1727096026.52511: in VariableManager get_vars() 7530 1727096026.52557: Calling all_inventory to load vars for managed_node3 7530 1727096026.52559: Calling groups_inventory to load vars for managed_node3 7530 1727096026.52562: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096026.53078: Calling all_plugins_play to load vars for managed_node3 7530 1727096026.53082: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096026.53087: Calling groups_plugins_play to load vars for managed_node3 7530 1727096026.56214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096026.59576: done with get_vars() 7530 1727096026.59611: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Monday 23 September 2024 08:53:46 -0400 (0:00:01.203) 0:00:17.385 ******

7530 1727096026.59695: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096026.59697: Creating lock for fedora.linux_system_roles.network_state 7530 1727096026.60335: worker is 1 (out of 1 available) 7530 1727096026.60348: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096026.60361: done queuing things up, now waiting for results queue to drain 7530 1727096026.60362: waiting for pending results... 
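The exchange above is Ansible's standard remote-module lifecycle: probe the remote home directory (`echo ~ && sleep 0`), create a per-task temp dir under `~/.ansible/tmp` with `umask 77 && mkdir`, SFTP the AnsiballZ payload across, `chmod u+x` it, execute it with the remote Python, read the module's JSON result from stdout, then `rm -f -r` the temp dir. A minimal local sketch of that sequence, assuming a plain subprocess stand-in for the SSH connection plugin (helper names here are hypothetical, not Ansible's actual API):

```python
import json
import shlex
import subprocess
import sys
import tempfile
import time


def run(cmd):
    # Stand-in for _low_level_execute_command(): run a shell command
    # and return (rc, stdout), the way each step is reported in the log.
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return proc.returncode, proc.stdout


# 1. Discover the remote home directory ("echo ~ && sleep 0" in the log).
_, home = run("sh -c 'echo ~ && sleep 0'")
home = home.strip()

# 2. Create the per-task temp dir, mirroring the umask/mkdir one-liner.
tmpdir = f"{tempfile.gettempdir()}/ansible-tmp-{time.time()}-demo"
rc, _ = run(f"sh -c '( umask 77 && mkdir -p {shlex.quote(tmpdir)} )'")
assert rc == 0

# 3. "Transfer" a stand-in for AnsiballZ_network_connections.py and make
#    it executable (the log does this over SFTP, then chmod u+x).
module_path = f"{tmpdir}/AnsiballZ_demo.py"
with open(module_path, "w") as f:
    f.write('import json; print(json.dumps({"changed": True}))\n')
run(f"chmod u+x {shlex.quote(module_path)}")

# 4. Execute the module and parse its JSON result from stdout,
#    as _execute_module() does with the AnsiballZ wrapper's output.
rc, out = run(f"{shlex.quote(sys.executable)} {shlex.quote(module_path)}")
result = json.loads(out)

# 5. Clean up, as the trailing "rm -f -r .../ansible-tmp-..." command does.
run(f"rm -rf {shlex.quote(tmpdir)}")
```

In the real run, every one of these commands is multiplexed over the existing SSH ControlMaster socket (`/root/.ansible/cp/e9699315b0`), which is why each step's stderr shows `mux_client_request_session` instead of a fresh handshake.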
7530 1727096026.60675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096026.60838: in run() - task 0afff68d-5257-086b-f4f0-000000000027 7530 1727096026.60861: variable 'ansible_search_path' from source: unknown 7530 1727096026.60873: variable 'ansible_search_path' from source: unknown 7530 1727096026.60974: calling self._execute() 7530 1727096026.61036: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.61048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.61063: variable 'omit' from source: magic vars 7530 1727096026.61476: variable 'ansible_distribution_major_version' from source: facts 7530 1727096026.61497: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096026.61631: variable 'network_state' from source: role '' defaults 7530 1727096026.61647: Evaluated conditional (network_state != {}): False 7530 1727096026.61654: when evaluation is False, skipping this task 7530 1727096026.61680: _execute() done 7530 1727096026.61683: dumping result to json 7530 1727096026.61685: done dumping result, returning 7530 1727096026.61773: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-086b-f4f0-000000000027] 7530 1727096026.61784: sending task result for task 0afff68d-5257-086b-f4f0-000000000027 7530 1727096026.61859: done sending task result for task 0afff68d-5257-086b-f4f0-000000000027 7530 1727096026.61863: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7530 1727096026.61926: no more pending results, returning what we have 7530 1727096026.61931: results queue empty 7530 1727096026.61932: checking for any_errors_fatal 7530 1727096026.61943: done checking for any_errors_fatal 7530 1727096026.61944: 
checking for max_fail_percentage 7530 1727096026.61946: done checking for max_fail_percentage 7530 1727096026.61947: checking to see if all hosts have failed and the running result is not ok 7530 1727096026.61948: done checking to see if all hosts have failed 7530 1727096026.61949: getting the remaining hosts for this loop 7530 1727096026.61950: done getting the remaining hosts for this loop 7530 1727096026.61954: getting the next task for host managed_node3 7530 1727096026.61961: done getting next task for host managed_node3 7530 1727096026.61966: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096026.61971: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096026.61988: getting variables 7530 1727096026.61989: in VariableManager get_vars() 7530 1727096026.62041: Calling all_inventory to load vars for managed_node3 7530 1727096026.62044: Calling groups_inventory to load vars for managed_node3 7530 1727096026.62047: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096026.62059: Calling all_plugins_play to load vars for managed_node3 7530 1727096026.62062: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096026.62065: Calling groups_plugins_play to load vars for managed_node3 7530 1727096026.63757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096026.65407: done with get_vars() 7530 1727096026.65445: done getting variables 7530 1727096026.65505: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Monday 23 September 2024 08:53:46 -0400 (0:00:00.058) 0:00:17.443 ******

7530 1727096026.65543: entering _queue_task() for managed_node3/debug 7530 1727096026.65986: worker is 1 (out of 1 available) 7530 1727096026.65997: exiting _queue_task() for managed_node3/debug 7530 1727096026.66007: done queuing things up, now waiting for results queue to drain 7530 1727096026.66008: waiting for pending results... 
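The `veth0` profile that the `network_connections` module reported earlier can be sanity-checked offline with Python's standard `ipaddress` module. The dict literal below is copied from the log's `module_args`; the gateway-reachability check is an illustrative sketch, not logic from the role itself:

```python
import ipaddress

# Connection profile as reported in the module result above.
profile = {
    "name": "veth0",
    "type": "ethernet",
    "state": "up",
    "ip": {
        "auto_gateway": True,
        "dhcp4": False,
        "auto6": False,
        "address": ["2001:db8::2/64", "203.0.113.2/24"],
        "gateway6": "2001:db8::1",
        "gateway4": "203.0.113.1",
        "route_metric4": 65535,
    },
}

# Parse each static address; ip_interface keeps the prefix length.
ifaces = [ipaddress.ip_interface(a) for a in profile["ip"]["address"]]
gw4 = ipaddress.ip_address(profile["ip"]["gateway4"])
gw6 = ipaddress.ip_address(profile["ip"]["gateway6"])

# Each gateway should fall inside one of the configured subnets.
v4_nets = [i.network for i in ifaces if i.version == 4]
v6_nets = [i.network for i in ifaces if i.version == 6]
gw4_ok = any(gw4 in n for n in v4_nets)
gw6_ok = any(gw6 in n for n in v6_nets)
```

Both checks pass for this profile: `203.0.113.1` sits inside `203.0.113.0/24` and `2001:db8::1` inside `2001:db8::/64` (both documentation ranges, consistent with a test playbook).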
7530 1727096026.66396: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096026.66401: in run() - task 0afff68d-5257-086b-f4f0-000000000028 7530 1727096026.66405: variable 'ansible_search_path' from source: unknown 7530 1727096026.66407: variable 'ansible_search_path' from source: unknown 7530 1727096026.66410: calling self._execute() 7530 1727096026.66493: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.66519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.66604: variable 'omit' from source: magic vars 7530 1727096026.66971: variable 'ansible_distribution_major_version' from source: facts 7530 1727096026.66991: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096026.67004: variable 'omit' from source: magic vars 7530 1727096026.67077: variable 'omit' from source: magic vars 7530 1727096026.67116: variable 'omit' from source: magic vars 7530 1727096026.67378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096026.67382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096026.67384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096026.67386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096026.67427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096026.67461: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096026.67484: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.67495: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7530 1727096026.67617: Set connection var ansible_pipelining to False 7530 1727096026.67630: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096026.67639: Set connection var ansible_timeout to 10 7530 1727096026.67651: Set connection var ansible_shell_executable to /bin/sh 7530 1727096026.67657: Set connection var ansible_shell_type to sh 7530 1727096026.67663: Set connection var ansible_connection to ssh 7530 1727096026.67796: variable 'ansible_shell_executable' from source: unknown 7530 1727096026.67802: variable 'ansible_connection' from source: unknown 7530 1727096026.67810: variable 'ansible_module_compression' from source: unknown 7530 1727096026.67813: variable 'ansible_shell_type' from source: unknown 7530 1727096026.67815: variable 'ansible_shell_executable' from source: unknown 7530 1727096026.67817: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.67819: variable 'ansible_pipelining' from source: unknown 7530 1727096026.67821: variable 'ansible_timeout' from source: unknown 7530 1727096026.67823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.67918: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096026.67937: variable 'omit' from source: magic vars 7530 1727096026.68017: starting attempt loop 7530 1727096026.68022: running the handler 7530 1727096026.68126: variable '__network_connections_result' from source: set_fact 7530 1727096026.68172: handler run complete 7530 1727096026.68196: attempt loop complete, returning result 7530 1727096026.68205: _execute() done 7530 1727096026.68211: dumping result to json 7530 1727096026.68234: done dumping result, returning 7530 
1727096026.68237: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-086b-f4f0-000000000028] 7530 1727096026.68249: sending task result for task 0afff68d-5257-086b-f4f0-000000000028 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3 (not-active)" ] } 7530 1727096026.68517: no more pending results, returning what we have 7530 1727096026.68521: results queue empty 7530 1727096026.68522: checking for any_errors_fatal 7530 1727096026.68529: done checking for any_errors_fatal 7530 1727096026.68529: checking for max_fail_percentage 7530 1727096026.68531: done checking for max_fail_percentage 7530 1727096026.68532: checking to see if all hosts have failed and the running result is not ok 7530 1727096026.68533: done checking to see if all hosts have failed 7530 1727096026.68534: getting the remaining hosts for this loop 7530 1727096026.68535: done getting the remaining hosts for this loop 7530 1727096026.68539: getting the next task for host managed_node3 7530 1727096026.68546: done getting next task for host managed_node3 7530 1727096026.68550: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096026.68553: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096026.68566: getting variables 7530 1727096026.68569: in VariableManager get_vars() 7530 1727096026.68621: Calling all_inventory to load vars for managed_node3 7530 1727096026.68625: Calling groups_inventory to load vars for managed_node3 7530 1727096026.68627: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096026.68639: Calling all_plugins_play to load vars for managed_node3 7530 1727096026.68642: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096026.68645: Calling groups_plugins_play to load vars for managed_node3 7530 1727096026.69297: done sending task result for task 0afff68d-5257-086b-f4f0-000000000028 7530 1727096026.69300: WORKER PROCESS EXITING 7530 1727096026.70452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096026.73913: done with get_vars() 7530 1727096026.73950: done getting variables 7530 1727096026.74020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:53:46 -0400 (0:00:00.085) 0:00:17.528 ****** 7530 1727096026.74056: entering _queue_task() for managed_node3/debug 7530 1727096026.74400: worker is 1 (out of 1 available) 7530 1727096026.74413: exiting _queue_task() for managed_node3/debug 7530 1727096026.74426: done queuing things up, now waiting for results queue to drain 7530 1727096026.74427: waiting for pending results... 
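Judging from the result printed above and the task path shown in its banner (`roles/network/tasks/main.yml:177`), the "Show stderr messages" task is presumably a plain `debug` of the registered result variable, along these lines (a sketch reconstructed from the log, not the role's actual source):

```yaml
# Hypothetical reconstruction of the task at tasks/main.yml:177,
# inferred from the variable name printed in the debug output above.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```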
7530 1727096026.74686: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096026.74831: in run() - task 0afff68d-5257-086b-f4f0-000000000029 7530 1727096026.74848: variable 'ansible_search_path' from source: unknown 7530 1727096026.74852: variable 'ansible_search_path' from source: unknown 7530 1727096026.74891: calling self._execute() 7530 1727096026.74984: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.74990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.75001: variable 'omit' from source: magic vars 7530 1727096026.75391: variable 'ansible_distribution_major_version' from source: facts 7530 1727096026.75405: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096026.75477: variable 'omit' from source: magic vars 7530 1727096026.75481: variable 'omit' from source: magic vars 7530 1727096026.75512: variable 'omit' from source: magic vars 7530 1727096026.75556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096026.75598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096026.75617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096026.75638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096026.75651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096026.75682: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096026.75810: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.75814: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096026.76021: Set connection var ansible_pipelining to False 7530 1727096026.76024: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096026.76036: Set connection var ansible_timeout to 10 7530 1727096026.76048: Set connection var ansible_shell_executable to /bin/sh 7530 1727096026.76050: Set connection var ansible_shell_type to sh 7530 1727096026.76052: Set connection var ansible_connection to ssh 7530 1727096026.76214: variable 'ansible_shell_executable' from source: unknown 7530 1727096026.76217: variable 'ansible_connection' from source: unknown 7530 1727096026.76220: variable 'ansible_module_compression' from source: unknown 7530 1727096026.76223: variable 'ansible_shell_type' from source: unknown 7530 1727096026.76225: variable 'ansible_shell_executable' from source: unknown 7530 1727096026.76236: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.76239: variable 'ansible_pipelining' from source: unknown 7530 1727096026.76241: variable 'ansible_timeout' from source: unknown 7530 1727096026.76243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.76657: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096026.76675: variable 'omit' from source: magic vars 7530 1727096026.76802: starting attempt loop 7530 1727096026.76806: running the handler 7530 1727096026.77076: variable '__network_connections_result' from source: set_fact 7530 1727096026.77262: variable '__network_connections_result' from source: set_fact 7530 1727096026.77822: handler run complete 7530 1727096026.77991: attempt loop complete, returning result 7530 1727096026.77994: _execute() done 7530 1727096026.77997: dumping 
result to json 7530 1727096026.77999: done dumping result, returning 7530 1727096026.78097: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-086b-f4f0-000000000029] 7530 1727096026.78100: sending task result for task 0afff68d-5257-086b-f4f0-000000000029 7530 1727096026.78573: done sending task result for task 0afff68d-5257-086b-f4f0-000000000029 7530 1727096026.78576: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 863f1165-7589-4aec-bdb1-d3d32b99b3c3 (not-active)" ] } } 7530 1727096026.78681: no more pending results, returning what we have 7530 1727096026.78685: results queue empty 7530 1727096026.78687: checking for any_errors_fatal 7530 1727096026.78693: done checking for any_errors_fatal 7530 1727096026.78694: checking for max_fail_percentage 7530 1727096026.78696: done checking for max_fail_percentage 7530 1727096026.78696: checking to see if all hosts have failed and the running 
result is not ok 7530 1727096026.78697: done checking to see if all hosts have failed 7530 1727096026.78698: getting the remaining hosts for this loop 7530 1727096026.78699: done getting the remaining hosts for this loop 7530 1727096026.78703: getting the next task for host managed_node3 7530 1727096026.78710: done getting next task for host managed_node3 7530 1727096026.78715: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096026.78718: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096026.78729: getting variables 7530 1727096026.78736: in VariableManager get_vars() 7530 1727096026.78904: Calling all_inventory to load vars for managed_node3 7530 1727096026.78908: Calling groups_inventory to load vars for managed_node3 7530 1727096026.78910: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096026.78920: Calling all_plugins_play to load vars for managed_node3 7530 1727096026.78922: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096026.78925: Calling groups_plugins_play to load vars for managed_node3 7530 1727096026.80136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096026.81056: done with get_vars() 7530 1727096026.81081: done getting variables 7530 1727096026.81127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:53:46 -0400 (0:00:00.070) 0:00:17.599 ****** 7530 1727096026.81150: entering _queue_task() for managed_node3/debug 7530 1727096026.81400: worker is 1 (out of 1 available) 7530 1727096026.81413: exiting _queue_task() for managed_node3/debug 7530 1727096026.81428: done queuing things up, now waiting for results queue to drain 7530 1727096026.81430: waiting for pending results... 
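The `module_args` dumped in the debug result above imply a role invocation roughly like the following. This is a sketch: the `network_connections` spec is copied from the logged `module_args`, but the play structure around it is an assumption, not taken from the actual playbook.

```yaml
# Hypothetical playbook fragment; only the connection spec below comes
# from the module_args in the log, the surrounding play is assumed.
- hosts: managed_node3
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: veth0
        type: ethernet
        state: up
        ip:
          address:
            - "2001:db8::2/64"
            - "203.0.113.2/24"
          auto6: false
          auto_gateway: true
          dhcp4: false
          gateway4: 203.0.113.1
          gateway6: "2001:db8::1"
          route_metric4: 65535
```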
7530 1727096026.81660: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096026.81989: in run() - task 0afff68d-5257-086b-f4f0-00000000002a 7530 1727096026.81993: variable 'ansible_search_path' from source: unknown 7530 1727096026.81995: variable 'ansible_search_path' from source: unknown 7530 1727096026.81998: calling self._execute() 7530 1727096026.82001: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.82003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.82005: variable 'omit' from source: magic vars 7530 1727096026.82359: variable 'ansible_distribution_major_version' from source: facts 7530 1727096026.82371: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096026.82502: variable 'network_state' from source: role '' defaults 7530 1727096026.82513: Evaluated conditional (network_state != {}): False 7530 1727096026.82516: when evaluation is False, skipping this task 7530 1727096026.82518: _execute() done 7530 1727096026.82521: dumping result to json 7530 1727096026.82526: done dumping result, returning 7530 1727096026.82532: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-086b-f4f0-00000000002a] 7530 1727096026.82535: sending task result for task 0afff68d-5257-086b-f4f0-00000000002a 7530 1727096026.82626: done sending task result for task 0afff68d-5257-086b-f4f0-00000000002a 7530 1727096026.82629: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7530 1727096026.82707: no more pending results, returning what we have 7530 1727096026.82711: results queue empty 7530 1727096026.82712: checking for any_errors_fatal 7530 1727096026.82725: done checking for any_errors_fatal 7530 1727096026.82726: checking for max_fail_percentage 7530 
1727096026.82728: done checking for max_fail_percentage 7530 1727096026.82729: checking to see if all hosts have failed and the running result is not ok 7530 1727096026.82730: done checking to see if all hosts have failed 7530 1727096026.82730: getting the remaining hosts for this loop 7530 1727096026.82732: done getting the remaining hosts for this loop 7530 1727096026.82735: getting the next task for host managed_node3 7530 1727096026.82742: done getting next task for host managed_node3 7530 1727096026.82745: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096026.82748: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096026.82762: getting variables 7530 1727096026.82763: in VariableManager get_vars() 7530 1727096026.82812: Calling all_inventory to load vars for managed_node3 7530 1727096026.82815: Calling groups_inventory to load vars for managed_node3 7530 1727096026.82817: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096026.82829: Calling all_plugins_play to load vars for managed_node3 7530 1727096026.82831: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096026.82834: Calling groups_plugins_play to load vars for managed_node3 7530 1727096026.83731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096026.84580: done with get_vars() 7530 1727096026.84598: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:53:46 -0400 (0:00:00.035) 0:00:17.635 ****** 7530 1727096026.84673: entering _queue_task() for managed_node3/ping 7530 1727096026.84674: Creating lock for ping 7530 1727096026.84915: worker is 1 (out of 1 available) 7530 1727096026.84928: exiting _queue_task() for managed_node3/ping 7530 1727096026.84939: done queuing things up, now waiting for results queue to drain 7530 1727096026.84941: waiting for pending results... 
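The "Re-test connectivity" task queued above loads the `ping` action for `managed_node3` (note the "Creating lock for ping" record), so the task at `tasks/main.yml:192` is presumably just a bare ping (a guess from the log, not the role's source):

```yaml
# Hypothetical sketch of the task at tasks/main.yml:192, inferred from
# the ping action plugin being queued for it in the log above.
- name: Re-test connectivity
  ping:
```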
7530 1727096026.85386: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096026.85392: in run() - task 0afff68d-5257-086b-f4f0-00000000002b 7530 1727096026.85394: variable 'ansible_search_path' from source: unknown 7530 1727096026.85397: variable 'ansible_search_path' from source: unknown 7530 1727096026.85401: calling self._execute() 7530 1727096026.85440: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.85444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.85447: variable 'omit' from source: magic vars 7530 1727096026.85805: variable 'ansible_distribution_major_version' from source: facts 7530 1727096026.85816: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096026.85825: variable 'omit' from source: magic vars 7530 1727096026.85885: variable 'omit' from source: magic vars 7530 1727096026.85920: variable 'omit' from source: magic vars 7530 1727096026.85961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096026.85996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096026.86017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096026.86034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096026.86045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096026.86079: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096026.86082: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.86085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096026.86181: Set connection var ansible_pipelining to False 7530 1727096026.86187: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096026.86193: Set connection var ansible_timeout to 10 7530 1727096026.86203: Set connection var ansible_shell_executable to /bin/sh 7530 1727096026.86206: Set connection var ansible_shell_type to sh 7530 1727096026.86208: Set connection var ansible_connection to ssh 7530 1727096026.86233: variable 'ansible_shell_executable' from source: unknown 7530 1727096026.86236: variable 'ansible_connection' from source: unknown 7530 1727096026.86239: variable 'ansible_module_compression' from source: unknown 7530 1727096026.86242: variable 'ansible_shell_type' from source: unknown 7530 1727096026.86244: variable 'ansible_shell_executable' from source: unknown 7530 1727096026.86246: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096026.86250: variable 'ansible_pipelining' from source: unknown 7530 1727096026.86253: variable 'ansible_timeout' from source: unknown 7530 1727096026.86257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096026.86536: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096026.86541: variable 'omit' from source: magic vars 7530 1727096026.86543: starting attempt loop 7530 1727096026.86546: running the handler 7530 1727096026.86549: _low_level_execute_command(): starting 7530 1727096026.86550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096026.87297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096026.87316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096026.87329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096026.87402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096026.89108: stdout chunk (state=3): >>>/root <<< 7530 1727096026.89258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096026.89262: stdout chunk (state=3): >>><<< 7530 1727096026.89265: stderr chunk (state=3): >>><<< 7530 1727096026.89288: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096026.89308: _low_level_execute_command(): starting 7530 1727096026.89396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317 `" && echo ansible-tmp-1727096026.892955-8246-194983147638317="` echo /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317 `" ) && sleep 0' 7530 1727096026.89933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096026.89948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096026.89963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096026.89985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096026.90092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096026.90121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096026.90137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096026.90204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096026.92200: stdout chunk (state=3): >>>ansible-tmp-1727096026.892955-8246-194983147638317=/root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317 <<< 7530 1727096026.92333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096026.92347: stdout chunk (state=3): >>><<< 7530 1727096026.92369: stderr chunk (state=3): >>><<< 7530 1727096026.92574: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096026.892955-8246-194983147638317=/root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096026.92578: variable 'ansible_module_compression' from source: unknown 7530 1727096026.92580: ANSIBALLZ: Using lock for ping 7530 1727096026.92583: ANSIBALLZ: Acquiring lock 7530 1727096026.92585: ANSIBALLZ: Lock acquired: 139837168248320 7530 1727096026.92587: ANSIBALLZ: Creating module 7530 1727096027.05227: ANSIBALLZ: Writing module into payload 7530 1727096027.05303: ANSIBALLZ: Writing module 7530 1727096027.05331: ANSIBALLZ: Renaming module 7530 1727096027.05342: ANSIBALLZ: Done creating module 7530 1727096027.05364: variable 'ansible_facts' from source: unknown 7530 1727096027.05446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py 7530 1727096027.05696: Sending initial data 7530 1727096027.05699: Sent initial data (150 bytes) 7530 1727096027.06220: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096027.06233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.06248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.06258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096027.06271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096027.06323: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.06381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.06394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.06430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.06498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.08175: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096027.08266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096027.08275: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpe5e0lvc4 /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py <<< 7530 1727096027.08278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py" <<< 7530 1727096027.08320: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpe5e0lvc4" to remote "/root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py" <<< 7530 1727096027.09092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.09101: stdout chunk (state=3): >>><<< 7530 1727096027.09273: stderr chunk (state=3): >>><<< 7530 1727096027.09277: done transferring module to remote 7530 1727096027.09280: _low_level_execute_command(): starting 7530 1727096027.09283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/ /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py && sleep 0' 7530 1727096027.09898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096027.09912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.09961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096027.09979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.09994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.10073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.10106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.10127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.10151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.10237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.12189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.12194: stdout chunk (state=3): >>><<< 7530 1727096027.12196: stderr chunk (state=3): >>><<< 7530 1727096027.12307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096027.12311: _low_level_execute_command(): starting 7530 1727096027.12313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/AnsiballZ_ping.py && sleep 0' 7530 1727096027.12932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096027.12948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.12964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.13047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.13112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.13136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.13176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.13271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.29213: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7530 1727096027.30611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096027.30641: stderr chunk (state=3): >>><<< 7530 1727096027.30644: stdout chunk (state=3): >>><<< 7530 1727096027.30660: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096027.30682: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096027.30691: _low_level_execute_command(): starting 7530 1727096027.30695: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096026.892955-8246-194983147638317/ > /dev/null 2>&1 && sleep 0' 7530 1727096027.31132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.31139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.31156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.31159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.31220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.31225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.31231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.31265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.33127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.33153: stderr chunk (state=3): >>><<< 7530 1727096027.33156: stdout chunk (state=3): >>><<< 7530 1727096027.33172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7530 1727096027.33179: handler run complete 7530 1727096027.33191: attempt loop complete, returning result 7530 1727096027.33194: _execute() done 7530 1727096027.33196: dumping result to json 7530 1727096027.33198: done dumping result, returning 7530 1727096027.33206: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-086b-f4f0-00000000002b] 7530 1727096027.33210: sending task result for task 0afff68d-5257-086b-f4f0-00000000002b 7530 1727096027.33301: done sending task result for task 0afff68d-5257-086b-f4f0-00000000002b 7530 1727096027.33303: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7530 1727096027.33363: no more pending results, returning what we have 7530 1727096027.33367: results queue empty 7530 1727096027.33375: checking for any_errors_fatal 7530 1727096027.33382: done checking for any_errors_fatal 7530 1727096027.33382: checking for max_fail_percentage 7530 1727096027.33384: done checking for max_fail_percentage 7530 1727096027.33385: checking to see if all hosts have failed and the running result is not ok 7530 1727096027.33386: done checking to see if all hosts have failed 7530 1727096027.33387: getting the remaining hosts for this loop 7530 1727096027.33388: done getting the remaining hosts for this loop 7530 1727096027.33391: getting the next task for host managed_node3 7530 1727096027.33401: done getting next task for host managed_node3 7530 1727096027.33404: ^ task is: TASK: meta (role_complete) 7530 1727096027.33406: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096027.33416: getting variables 7530 1727096027.33418: in VariableManager get_vars() 7530 1727096027.33471: Calling all_inventory to load vars for managed_node3 7530 1727096027.33474: Calling groups_inventory to load vars for managed_node3 7530 1727096027.33483: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.33494: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.33496: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.33499: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.34313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.35284: done with get_vars() 7530 1727096027.35302: done getting variables 7530 1727096027.35369: done queuing things up, now waiting for results queue to drain 7530 1727096027.35371: results queue empty 7530 1727096027.35371: checking for any_errors_fatal 7530 1727096027.35373: done checking for any_errors_fatal 7530 1727096027.35374: checking for max_fail_percentage 7530 1727096027.35374: done checking for max_fail_percentage 7530 1727096027.35375: checking to see if all hosts have failed and the running result is not ok 7530 1727096027.35375: done checking to see if all hosts have failed 7530 1727096027.35376: getting the remaining hosts for this loop 7530 1727096027.35376: done getting the remaining hosts for this loop 7530 1727096027.35378: getting the next task for host managed_node3 7530 1727096027.35381: done getting next task for host managed_node3 7530 1727096027.35383: ^ task is: TASK: Include the task 'assert_device_present.yml' 7530 1727096027.35384: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096027.35385: getting variables 7530 1727096027.35386: in VariableManager get_vars() 7530 1727096027.35400: Calling all_inventory to load vars for managed_node3 7530 1727096027.35401: Calling groups_inventory to load vars for managed_node3 7530 1727096027.35403: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.35406: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.35408: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.35409: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.36047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.36903: done with get_vars() 7530 1727096027.36920: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:42 Monday 23 September 2024 08:53:47 -0400 (0:00:00.523) 0:00:18.158 ****** 7530 1727096027.36983: entering _queue_task() for managed_node3/include_tasks 7530 1727096027.37243: worker is 1 (out of 1 available) 7530 1727096027.37256: exiting _queue_task() for managed_node3/include_tasks 7530 1727096027.37270: done queuing things up, now waiting for results queue to drain 7530 1727096027.37272: waiting for pending results... 
7530 1727096027.37443: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7530 1727096027.37517: in run() - task 0afff68d-5257-086b-f4f0-00000000005b 7530 1727096027.37529: variable 'ansible_search_path' from source: unknown 7530 1727096027.37557: calling self._execute() 7530 1727096027.37633: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.37637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.37646: variable 'omit' from source: magic vars 7530 1727096027.37928: variable 'ansible_distribution_major_version' from source: facts 7530 1727096027.37936: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096027.37939: _execute() done 7530 1727096027.37947: dumping result to json 7530 1727096027.37950: done dumping result, returning 7530 1727096027.37953: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-086b-f4f0-00000000005b] 7530 1727096027.37955: sending task result for task 0afff68d-5257-086b-f4f0-00000000005b 7530 1727096027.38044: done sending task result for task 0afff68d-5257-086b-f4f0-00000000005b 7530 1727096027.38047: WORKER PROCESS EXITING 7530 1727096027.38075: no more pending results, returning what we have 7530 1727096027.38079: in VariableManager get_vars() 7530 1727096027.38137: Calling all_inventory to load vars for managed_node3 7530 1727096027.38140: Calling groups_inventory to load vars for managed_node3 7530 1727096027.38142: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.38155: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.38158: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.38160: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.39044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096027.40427: done with get_vars() 7530 1727096027.40455: variable 'ansible_search_path' from source: unknown 7530 1727096027.40475: we have included files to process 7530 1727096027.40476: generating all_blocks data 7530 1727096027.40479: done generating all_blocks data 7530 1727096027.40485: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096027.40486: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096027.40489: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096027.40608: in VariableManager get_vars() 7530 1727096027.40643: done with get_vars() 7530 1727096027.40774: done processing included file 7530 1727096027.40777: iterating over new_blocks loaded from include file 7530 1727096027.40779: in VariableManager get_vars() 7530 1727096027.40801: done with get_vars() 7530 1727096027.40803: filtering new block on tags 7530 1727096027.40819: done filtering new block on tags 7530 1727096027.40821: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7530 1727096027.40830: extending task lists for all hosts with included blocks 7530 1727096027.44897: done extending task lists 7530 1727096027.44900: done processing included files 7530 1727096027.44901: results queue empty 7530 1727096027.44901: checking for any_errors_fatal 7530 1727096027.44903: done checking for any_errors_fatal 7530 1727096027.44903: checking for max_fail_percentage 7530 1727096027.44905: done checking for max_fail_percentage 7530 1727096027.44905: checking to see if all hosts have failed and the running 
result is not ok 7530 1727096027.44906: done checking to see if all hosts have failed 7530 1727096027.44906: getting the remaining hosts for this loop 7530 1727096027.44907: done getting the remaining hosts for this loop 7530 1727096027.44909: getting the next task for host managed_node3 7530 1727096027.44912: done getting next task for host managed_node3 7530 1727096027.44914: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7530 1727096027.44916: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096027.44918: getting variables 7530 1727096027.44918: in VariableManager get_vars() 7530 1727096027.44938: Calling all_inventory to load vars for managed_node3 7530 1727096027.44940: Calling groups_inventory to load vars for managed_node3 7530 1727096027.44941: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.44946: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.44948: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.44949: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.45616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.50232: done with get_vars() 7530 1727096027.50253: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:53:47 -0400 (0:00:00.133) 0:00:18.291 ****** 7530 1727096027.50309: entering _queue_task() for managed_node3/include_tasks 7530 1727096027.50570: worker is 1 (out of 1 available) 7530 1727096027.50581: exiting _queue_task() for managed_node3/include_tasks 7530 1727096027.50593: done queuing things up, now waiting for results queue to drain 7530 1727096027.50595: waiting for pending results... 
7530 1727096027.50777: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7530 1727096027.50855: in run() - task 0afff68d-5257-086b-f4f0-0000000008c2 7530 1727096027.50866: variable 'ansible_search_path' from source: unknown 7530 1727096027.50872: variable 'ansible_search_path' from source: unknown 7530 1727096027.50899: calling self._execute() 7530 1727096027.50978: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.50982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.50992: variable 'omit' from source: magic vars 7530 1727096027.51284: variable 'ansible_distribution_major_version' from source: facts 7530 1727096027.51293: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096027.51299: _execute() done 7530 1727096027.51302: dumping result to json 7530 1727096027.51304: done dumping result, returning 7530 1727096027.51311: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-086b-f4f0-0000000008c2] 7530 1727096027.51315: sending task result for task 0afff68d-5257-086b-f4f0-0000000008c2 7530 1727096027.51407: done sending task result for task 0afff68d-5257-086b-f4f0-0000000008c2 7530 1727096027.51410: WORKER PROCESS EXITING 7530 1727096027.51441: no more pending results, returning what we have 7530 1727096027.51448: in VariableManager get_vars() 7530 1727096027.51505: Calling all_inventory to load vars for managed_node3 7530 1727096027.51508: Calling groups_inventory to load vars for managed_node3 7530 1727096027.51510: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.51531: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.51535: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.51538: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.52695: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.53809: done with get_vars() 7530 1727096027.53829: variable 'ansible_search_path' from source: unknown 7530 1727096027.53830: variable 'ansible_search_path' from source: unknown 7530 1727096027.53859: we have included files to process 7530 1727096027.53860: generating all_blocks data 7530 1727096027.53862: done generating all_blocks data 7530 1727096027.53863: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096027.53864: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096027.53865: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096027.54002: done processing included file 7530 1727096027.54003: iterating over new_blocks loaded from include file 7530 1727096027.54005: in VariableManager get_vars() 7530 1727096027.54022: done with get_vars() 7530 1727096027.54024: filtering new block on tags 7530 1727096027.54035: done filtering new block on tags 7530 1727096027.54037: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7530 1727096027.54041: extending task lists for all hosts with included blocks 7530 1727096027.54105: done extending task lists 7530 1727096027.54106: done processing included files 7530 1727096027.54107: results queue empty 7530 1727096027.54107: checking for any_errors_fatal 7530 1727096027.54110: done checking for any_errors_fatal 7530 1727096027.54110: checking for max_fail_percentage 7530 1727096027.54111: done checking for max_fail_percentage 7530 1727096027.54111: 
checking to see if all hosts have failed and the running result is not ok 7530 1727096027.54112: done checking to see if all hosts have failed 7530 1727096027.54112: getting the remaining hosts for this loop 7530 1727096027.54113: done getting the remaining hosts for this loop 7530 1727096027.54115: getting the next task for host managed_node3 7530 1727096027.54117: done getting next task for host managed_node3 7530 1727096027.54119: ^ task is: TASK: Get stat for interface {{ interface }} 7530 1727096027.54121: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096027.54123: getting variables 7530 1727096027.54124: in VariableManager get_vars() 7530 1727096027.54137: Calling all_inventory to load vars for managed_node3 7530 1727096027.54139: Calling groups_inventory to load vars for managed_node3 7530 1727096027.54140: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.54145: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.54146: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.54148: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.54836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.56078: done with get_vars() 7530 1727096027.56110: done getting variables 7530 1727096027.56283: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:53:47 -0400 (0:00:00.060) 0:00:18.351 ****** 7530 1727096027.56315: entering _queue_task() for managed_node3/stat 7530 1727096027.56647: worker is 1 (out of 1 available) 7530 1727096027.56658: exiting _queue_task() for managed_node3/stat 7530 1727096027.56670: done queuing things up, now waiting for results queue to drain 7530 1727096027.56672: waiting for pending results... 
7530 1727096027.57087: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7530 1727096027.57131: in run() - task 0afff68d-5257-086b-f4f0-000000000ac6 7530 1727096027.57151: variable 'ansible_search_path' from source: unknown 7530 1727096027.57221: variable 'ansible_search_path' from source: unknown 7530 1727096027.57228: calling self._execute() 7530 1727096027.57303: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.57316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.57339: variable 'omit' from source: magic vars 7530 1727096027.57735: variable 'ansible_distribution_major_version' from source: facts 7530 1727096027.57754: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096027.57771: variable 'omit' from source: magic vars 7530 1727096027.57825: variable 'omit' from source: magic vars 7530 1727096027.57933: variable 'interface' from source: play vars 7530 1727096027.57984: variable 'omit' from source: magic vars 7530 1727096027.58006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096027.58048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096027.58074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096027.58172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096027.58176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096027.58178: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096027.58182: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.58185: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7530 1727096027.58271: Set connection var ansible_pipelining to False 7530 1727096027.58283: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096027.58293: Set connection var ansible_timeout to 10 7530 1727096027.58310: Set connection var ansible_shell_executable to /bin/sh 7530 1727096027.58317: Set connection var ansible_shell_type to sh 7530 1727096027.58325: Set connection var ansible_connection to ssh 7530 1727096027.58355: variable 'ansible_shell_executable' from source: unknown 7530 1727096027.58363: variable 'ansible_connection' from source: unknown 7530 1727096027.58371: variable 'ansible_module_compression' from source: unknown 7530 1727096027.58377: variable 'ansible_shell_type' from source: unknown 7530 1727096027.58383: variable 'ansible_shell_executable' from source: unknown 7530 1727096027.58389: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.58396: variable 'ansible_pipelining' from source: unknown 7530 1727096027.58401: variable 'ansible_timeout' from source: unknown 7530 1727096027.58415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.58633: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096027.58741: variable 'omit' from source: magic vars 7530 1727096027.58744: starting attempt loop 7530 1727096027.58747: running the handler 7530 1727096027.58748: _low_level_execute_command(): starting 7530 1727096027.58750: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096027.59432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096027.59491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.59570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.59600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.59620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.59700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.61413: stdout chunk (state=3): >>>/root <<< 7530 1727096027.61570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.61574: stdout chunk (state=3): >>><<< 7530 1727096027.61577: stderr chunk (state=3): >>><<< 7530 1727096027.61602: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096027.61628: _low_level_execute_command(): starting 7530 1727096027.61693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548 `" && echo ansible-tmp-1727096027.6160975-8268-66422070590548="` echo /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548 `" ) && sleep 0' 7530 1727096027.62292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096027.62350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.62419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.62481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.62515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.64516: stdout chunk (state=3): >>>ansible-tmp-1727096027.6160975-8268-66422070590548=/root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548 <<< 7530 1727096027.64697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.64701: stdout chunk (state=3): >>><<< 7530 1727096027.64704: stderr chunk (state=3): >>><<< 7530 1727096027.64775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096027.6160975-8268-66422070590548=/root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096027.64785: variable 'ansible_module_compression' from source: unknown 7530 1727096027.64861: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7530 1727096027.64915: variable 'ansible_facts' from source: unknown 7530 1727096027.65025: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py 7530 1727096027.65289: Sending initial data 7530 1727096027.65292: Sent initial data (150 bytes) 7530 1727096027.65772: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096027.65780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.65792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.65807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096027.65910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.65914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.65916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.65951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.66002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.67678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096027.67702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096027.67739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpu42o3fkq /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py <<< 7530 1727096027.67742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py" <<< 7530 1727096027.67764: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpu42o3fkq" to remote "/root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py" <<< 7530 1727096027.67775: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py" <<< 7530 1727096027.68301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.68343: stderr chunk (state=3): >>><<< 7530 1727096027.68346: stdout chunk (state=3): >>><<< 7530 1727096027.68365: done transferring module to remote 7530 1727096027.68376: _low_level_execute_command(): starting 7530 1727096027.68381: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/ /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py && sleep 0' 7530 1727096027.68846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.68850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096027.68852: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.68854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.68857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.68904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.68908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.68912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.68945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.70778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.70808: stderr chunk (state=3): >>><<< 7530 1727096027.70811: stdout chunk (state=3): >>><<< 7530 1727096027.70826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096027.70829: _low_level_execute_command(): starting 7530 1727096027.70832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/AnsiballZ_stat.py && sleep 0' 7530 1727096027.71263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.71266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096027.71297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096027.71300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096027.71302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.71304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.71357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.71360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.71373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.71426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.87110: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25177, "dev": 23, "nlink": 1, "atime": 1727096018.5148296, "mtime": 1727096018.5148296, "ctime": 1727096018.5148296, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7530 1727096027.88503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096027.88531: stderr chunk (state=3): >>><<< 7530 1727096027.88534: stdout chunk (state=3): >>><<< 7530 1727096027.88550: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25177, "dev": 23, "nlink": 1, "atime": 1727096018.5148296, "mtime": 1727096018.5148296, "ctime": 1727096018.5148296, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096027.88590: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096027.88599: _low_level_execute_command(): starting 7530 1727096027.88604: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096027.6160975-8268-66422070590548/ > /dev/null 2>&1 && sleep 0' 7530 1727096027.89045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096027.89085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096027.89088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 
1727096027.89090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096027.89092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096027.89138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096027.89142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096027.89150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096027.89194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096027.91047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096027.91073: stderr chunk (state=3): >>><<< 7530 1727096027.91076: stdout chunk (state=3): >>><<< 7530 1727096027.91094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096027.91101: handler run complete 7530 1727096027.91136: attempt loop complete, returning result 7530 1727096027.91139: _execute() done 7530 1727096027.91141: dumping result to json 7530 1727096027.91145: done dumping result, returning 7530 1727096027.91152: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0afff68d-5257-086b-f4f0-000000000ac6] 7530 1727096027.91156: sending task result for task 0afff68d-5257-086b-f4f0-000000000ac6 7530 1727096027.91262: done sending task result for task 0afff68d-5257-086b-f4f0-000000000ac6 7530 1727096027.91265: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096018.5148296, "block_size": 4096, "blocks": 0, "ctime": 1727096018.5148296, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25177, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727096018.5148296, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7530 1727096027.91353: no more pending results, returning what we have 7530 1727096027.91356: 
results queue empty 7530 1727096027.91357: checking for any_errors_fatal 7530 1727096027.91358: done checking for any_errors_fatal 7530 1727096027.91359: checking for max_fail_percentage 7530 1727096027.91361: done checking for max_fail_percentage 7530 1727096027.91361: checking to see if all hosts have failed and the running result is not ok 7530 1727096027.91362: done checking to see if all hosts have failed 7530 1727096027.91363: getting the remaining hosts for this loop 7530 1727096027.91364: done getting the remaining hosts for this loop 7530 1727096027.91370: getting the next task for host managed_node3 7530 1727096027.91379: done getting next task for host managed_node3 7530 1727096027.91382: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7530 1727096027.91384: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096027.91389: getting variables 7530 1727096027.91390: in VariableManager get_vars() 7530 1727096027.91434: Calling all_inventory to load vars for managed_node3 7530 1727096027.91437: Calling groups_inventory to load vars for managed_node3 7530 1727096027.91439: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.91449: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.91452: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.91454: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.92245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.93114: done with get_vars() 7530 1727096027.93132: done getting variables 7530 1727096027.93177: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096027.93266: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:53:47 -0400 (0:00:00.369) 0:00:18.721 ****** 7530 1727096027.93289: entering _queue_task() for managed_node3/assert 7530 1727096027.93529: worker is 1 (out of 1 available) 7530 1727096027.93540: exiting _queue_task() for managed_node3/assert 7530 1727096027.93552: done queuing things up, now waiting for results queue to drain 7530 1727096027.93554: waiting for pending results... 
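The `_low_level_execute_command()` calls earlier in the log create a private remote temp directory before transferring `AnsiballZ_stat.py`, then remove it after the module runs. The pattern is visible verbatim in the executed commands: `umask 77` makes `mkdir` create the directory mode 0700, and the `echo` returns the resolved path to the controller. A local sketch of that same pattern (directory names here are illustrative, not Ansible's):

```shell
#!/bin/sh
# Recreate the remote-tmpdir pattern from the log: umask 77 ensures the
# directory is created 0700 (private to the connecting user), and echoing
# the name hands the resolved path back to the caller.
base="${TMPDIR:-/tmp}/ansible-sketch-$$"
( umask 77 && mkdir -p "$base" && echo "$base" )
# cleanup mirrors the log's: rm -f -r <tmpdir>/ > /dev/null 2>&1
rm -rf "$base"
```

Creating the directory with a restrictive umask, rather than `chmod`-ing it afterwards, avoids a window in which other local users could enter the directory before permissions are tightened.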
7530 1727096027.93732: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7530 1727096027.93803: in run() - task 0afff68d-5257-086b-f4f0-0000000008c3 7530 1727096027.93814: variable 'ansible_search_path' from source: unknown 7530 1727096027.93817: variable 'ansible_search_path' from source: unknown 7530 1727096027.93848: calling self._execute() 7530 1727096027.93926: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.93933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.93943: variable 'omit' from source: magic vars 7530 1727096027.94221: variable 'ansible_distribution_major_version' from source: facts 7530 1727096027.94234: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096027.94239: variable 'omit' from source: magic vars 7530 1727096027.94268: variable 'omit' from source: magic vars 7530 1727096027.94340: variable 'interface' from source: play vars 7530 1727096027.94353: variable 'omit' from source: magic vars 7530 1727096027.94386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096027.94413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096027.94432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096027.94448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096027.94456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096027.94482: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096027.94485: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.94488: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.94558: Set connection var ansible_pipelining to False 7530 1727096027.94563: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096027.94571: Set connection var ansible_timeout to 10 7530 1727096027.94579: Set connection var ansible_shell_executable to /bin/sh 7530 1727096027.94582: Set connection var ansible_shell_type to sh 7530 1727096027.94584: Set connection var ansible_connection to ssh 7530 1727096027.94603: variable 'ansible_shell_executable' from source: unknown 7530 1727096027.94606: variable 'ansible_connection' from source: unknown 7530 1727096027.94608: variable 'ansible_module_compression' from source: unknown 7530 1727096027.94611: variable 'ansible_shell_type' from source: unknown 7530 1727096027.94613: variable 'ansible_shell_executable' from source: unknown 7530 1727096027.94615: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.94619: variable 'ansible_pipelining' from source: unknown 7530 1727096027.94621: variable 'ansible_timeout' from source: unknown 7530 1727096027.94628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.94729: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096027.94740: variable 'omit' from source: magic vars 7530 1727096027.94743: starting attempt loop 7530 1727096027.94747: running the handler 7530 1727096027.94841: variable 'interface_stat' from source: set_fact 7530 1727096027.94855: Evaluated conditional (interface_stat.stat.exists): True 7530 1727096027.94860: handler run complete 7530 1727096027.94873: attempt loop complete, returning result 7530 1727096027.94877: _execute() done 
7530 1727096027.94880: dumping result to json 7530 1727096027.94882: done dumping result, returning 7530 1727096027.94889: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0afff68d-5257-086b-f4f0-0000000008c3] 7530 1727096027.94892: sending task result for task 0afff68d-5257-086b-f4f0-0000000008c3 7530 1727096027.94976: done sending task result for task 0afff68d-5257-086b-f4f0-0000000008c3 7530 1727096027.94979: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096027.95030: no more pending results, returning what we have 7530 1727096027.95033: results queue empty 7530 1727096027.95038: checking for any_errors_fatal 7530 1727096027.95047: done checking for any_errors_fatal 7530 1727096027.95048: checking for max_fail_percentage 7530 1727096027.95050: done checking for max_fail_percentage 7530 1727096027.95051: checking to see if all hosts have failed and the running result is not ok 7530 1727096027.95052: done checking to see if all hosts have failed 7530 1727096027.95052: getting the remaining hosts for this loop 7530 1727096027.95054: done getting the remaining hosts for this loop 7530 1727096027.95057: getting the next task for host managed_node3 7530 1727096027.95065: done getting next task for host managed_node3 7530 1727096027.95070: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7530 1727096027.95072: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096027.95076: getting variables 7530 1727096027.95077: in VariableManager get_vars() 7530 1727096027.95127: Calling all_inventory to load vars for managed_node3 7530 1727096027.95130: Calling groups_inventory to load vars for managed_node3 7530 1727096027.95132: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.95142: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.95145: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.95147: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.96039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096027.96891: done with get_vars() 7530 1727096027.96908: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:44 Monday 23 September 2024 08:53:47 -0400 (0:00:00.036) 0:00:18.758 ****** 7530 1727096027.96980: entering _queue_task() for managed_node3/include_tasks 7530 1727096027.97222: worker is 1 (out of 1 available) 7530 1727096027.97237: exiting _queue_task() for managed_node3/include_tasks 7530 1727096027.97249: done queuing things up, now waiting for results queue to drain 7530 1727096027.97250: waiting for pending results... 
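The include task queued above lives at tests_auto_gateway.yml:44, and the subsequent "processing included file" lines show it pulls in tasks/assert_profile_present.yml. A minimal sketch of what that play-level task plausibly looks like; the file path matches what the log loads, but any passed vars are not visible here and are omitted.

```yaml
# Hypothetical sketch of tests_auto_gateway.yml:44; the included
# path is the one this log resolves and loads.
- name: Include the task 'assert_profile_present.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
```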
7530 1727096027.97429: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7530 1727096027.97497: in run() - task 0afff68d-5257-086b-f4f0-00000000005c 7530 1727096027.97509: variable 'ansible_search_path' from source: unknown 7530 1727096027.97541: calling self._execute() 7530 1727096027.97616: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096027.97622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096027.97633: variable 'omit' from source: magic vars 7530 1727096027.97918: variable 'ansible_distribution_major_version' from source: facts 7530 1727096027.97933: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096027.97939: _execute() done 7530 1727096027.97942: dumping result to json 7530 1727096027.97945: done dumping result, returning 7530 1727096027.97951: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-086b-f4f0-00000000005c] 7530 1727096027.97956: sending task result for task 0afff68d-5257-086b-f4f0-00000000005c 7530 1727096027.98052: done sending task result for task 0afff68d-5257-086b-f4f0-00000000005c 7530 1727096027.98055: WORKER PROCESS EXITING 7530 1727096027.98082: no more pending results, returning what we have 7530 1727096027.98087: in VariableManager get_vars() 7530 1727096027.98142: Calling all_inventory to load vars for managed_node3 7530 1727096027.98145: Calling groups_inventory to load vars for managed_node3 7530 1727096027.98147: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096027.98159: Calling all_plugins_play to load vars for managed_node3 7530 1727096027.98162: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096027.98165: Calling groups_plugins_play to load vars for managed_node3 7530 1727096027.98958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7530 1727096027.99813: done with get_vars() 7530 1727096027.99829: variable 'ansible_search_path' from source: unknown 7530 1727096027.99841: we have included files to process 7530 1727096027.99842: generating all_blocks data 7530 1727096027.99843: done generating all_blocks data 7530 1727096027.99845: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7530 1727096027.99846: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7530 1727096027.99847: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7530 1727096027.99987: in VariableManager get_vars() 7530 1727096028.00007: done with get_vars() 7530 1727096028.00179: done processing included file 7530 1727096028.00180: iterating over new_blocks loaded from include file 7530 1727096028.00182: in VariableManager get_vars() 7530 1727096028.00196: done with get_vars() 7530 1727096028.00198: filtering new block on tags 7530 1727096028.00211: done filtering new block on tags 7530 1727096028.00212: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7530 1727096028.00216: extending task lists for all hosts with included blocks 7530 1727096028.02779: done extending task lists 7530 1727096028.02780: done processing included files 7530 1727096028.02781: results queue empty 7530 1727096028.02781: checking for any_errors_fatal 7530 1727096028.02783: done checking for any_errors_fatal 7530 1727096028.02784: checking for max_fail_percentage 7530 1727096028.02785: done checking for max_fail_percentage 7530 1727096028.02785: checking to see if all hosts have failed and the 
running result is not ok 7530 1727096028.02786: done checking to see if all hosts have failed 7530 1727096028.02786: getting the remaining hosts for this loop 7530 1727096028.02787: done getting the remaining hosts for this loop 7530 1727096028.02789: getting the next task for host managed_node3 7530 1727096028.02791: done getting next task for host managed_node3 7530 1727096028.02793: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7530 1727096028.02795: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096028.02796: getting variables 7530 1727096028.02797: in VariableManager get_vars() 7530 1727096028.02811: Calling all_inventory to load vars for managed_node3 7530 1727096028.02813: Calling groups_inventory to load vars for managed_node3 7530 1727096028.02814: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096028.02819: Calling all_plugins_play to load vars for managed_node3 7530 1727096028.02821: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096028.02822: Calling groups_plugins_play to load vars for managed_node3 7530 1727096028.03456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096028.04303: done with get_vars() 7530 1727096028.04318: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:53:48 -0400 (0:00:00.073) 0:00:18.832 ****** 7530 1727096028.04378: entering _queue_task() for managed_node3/include_tasks 7530 1727096028.04634: worker is 1 (out of 1 available) 7530 1727096028.04645: exiting _queue_task() for managed_node3/include_tasks 7530 1727096028.04656: done queuing things up, now waiting for results queue to drain 7530 1727096028.04658: waiting for pending results... 
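This nested include sits at assert_profile_present.yml:3 and, per the file-processing lines that follow, loads get_profile_stat.yml from the same tasks directory. A hedged sketch, assuming a relative include and no extra vars (neither is shown explicitly in this log):

```yaml
# Hypothetical sketch of assert_profile_present.yml:3; the target
# file name is confirmed by the "Loading data from ..." line,
# the relative form of the path is an assumption.
- name: Include the task 'get_profile_stat.yml'
  ansible.builtin.include_tasks: get_profile_stat.yml
```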
7530 1727096028.04834: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7530 1727096028.04908: in run() - task 0afff68d-5257-086b-f4f0-000000000ade 7530 1727096028.04920: variable 'ansible_search_path' from source: unknown 7530 1727096028.04922: variable 'ansible_search_path' from source: unknown 7530 1727096028.04954: calling self._execute() 7530 1727096028.05032: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.05037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.05046: variable 'omit' from source: magic vars 7530 1727096028.05323: variable 'ansible_distribution_major_version' from source: facts 7530 1727096028.05336: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096028.05340: _execute() done 7530 1727096028.05345: dumping result to json 7530 1727096028.05347: done dumping result, returning 7530 1727096028.05354: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-086b-f4f0-000000000ade] 7530 1727096028.05358: sending task result for task 0afff68d-5257-086b-f4f0-000000000ade 7530 1727096028.05444: done sending task result for task 0afff68d-5257-086b-f4f0-000000000ade 7530 1727096028.05446: WORKER PROCESS EXITING 7530 1727096028.05475: no more pending results, returning what we have 7530 1727096028.05480: in VariableManager get_vars() 7530 1727096028.05531: Calling all_inventory to load vars for managed_node3 7530 1727096028.05534: Calling groups_inventory to load vars for managed_node3 7530 1727096028.05536: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096028.05548: Calling all_plugins_play to load vars for managed_node3 7530 1727096028.05550: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096028.05553: Calling groups_plugins_play to load vars for managed_node3 7530 1727096028.06432: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096028.07280: done with get_vars() 7530 1727096028.07294: variable 'ansible_search_path' from source: unknown 7530 1727096028.07295: variable 'ansible_search_path' from source: unknown 7530 1727096028.07323: we have included files to process 7530 1727096028.07324: generating all_blocks data 7530 1727096028.07326: done generating all_blocks data 7530 1727096028.07327: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7530 1727096028.07327: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7530 1727096028.07329: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7530 1727096028.08008: done processing included file 7530 1727096028.08010: iterating over new_blocks loaded from include file 7530 1727096028.08011: in VariableManager get_vars() 7530 1727096028.08029: done with get_vars() 7530 1727096028.08030: filtering new block on tags 7530 1727096028.08044: done filtering new block on tags 7530 1727096028.08045: in VariableManager get_vars() 7530 1727096028.08058: done with get_vars() 7530 1727096028.08059: filtering new block on tags 7530 1727096028.08075: done filtering new block on tags 7530 1727096028.08077: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7530 1727096028.08081: extending task lists for all hosts with included blocks 7530 1727096028.08174: done extending task lists 7530 1727096028.08176: done processing included files 7530 1727096028.08177: results queue empty 7530 1727096028.08177: checking for any_errors_fatal 7530 
1727096028.08179: done checking for any_errors_fatal 7530 1727096028.08180: checking for max_fail_percentage 7530 1727096028.08180: done checking for max_fail_percentage 7530 1727096028.08181: checking to see if all hosts have failed and the running result is not ok 7530 1727096028.08181: done checking to see if all hosts have failed 7530 1727096028.08182: getting the remaining hosts for this loop 7530 1727096028.08183: done getting the remaining hosts for this loop 7530 1727096028.08184: getting the next task for host managed_node3 7530 1727096028.08187: done getting next task for host managed_node3 7530 1727096028.08188: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7530 1727096028.08190: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096028.08192: getting variables 7530 1727096028.08192: in VariableManager get_vars() 7530 1727096028.08244: Calling all_inventory to load vars for managed_node3 7530 1727096028.08246: Calling groups_inventory to load vars for managed_node3 7530 1727096028.08247: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096028.08251: Calling all_plugins_play to load vars for managed_node3 7530 1727096028.08253: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096028.08255: Calling groups_plugins_play to load vars for managed_node3 7530 1727096028.08861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096028.09757: done with get_vars() 7530 1727096028.09774: done getting variables 7530 1727096028.09809: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:53:48 -0400 (0:00:00.054) 0:00:18.886 ****** 7530 1727096028.09831: entering _queue_task() for managed_node3/set_fact 7530 1727096028.10083: worker is 1 (out of 1 available) 7530 1727096028.10097: exiting _queue_task() for managed_node3/set_fact 7530 1727096028.10109: done queuing things up, now waiting for results queue to drain 7530 1727096028.10111: waiting for pending results... 
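The set_fact task queued above (get_profile_stat.yml:3) can be reconstructed almost exactly, because its result JSON appears later in this log with three `ansible_facts` all initialized to `false`. A sketch built from that result:

```yaml
# Reconstruction of get_profile_stat.yml:3. The fact names and
# their initial false values come directly from the task result
# printed later in this log; ordering within set_fact is assumed.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```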
7530 1727096028.10292: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7530 1727096028.10370: in run() - task 0afff68d-5257-086b-f4f0-000000000cef 7530 1727096028.10382: variable 'ansible_search_path' from source: unknown 7530 1727096028.10385: variable 'ansible_search_path' from source: unknown 7530 1727096028.10413: calling self._execute() 7530 1727096028.10491: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.10495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.10504: variable 'omit' from source: magic vars 7530 1727096028.10784: variable 'ansible_distribution_major_version' from source: facts 7530 1727096028.10795: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096028.10800: variable 'omit' from source: magic vars 7530 1727096028.10832: variable 'omit' from source: magic vars 7530 1727096028.10857: variable 'omit' from source: magic vars 7530 1727096028.10894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096028.10920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096028.10938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096028.10952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096028.10962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096028.10989: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096028.10993: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.10996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096028.11066: Set connection var ansible_pipelining to False 7530 1727096028.11071: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096028.11078: Set connection var ansible_timeout to 10 7530 1727096028.11085: Set connection var ansible_shell_executable to /bin/sh 7530 1727096028.11088: Set connection var ansible_shell_type to sh 7530 1727096028.11090: Set connection var ansible_connection to ssh 7530 1727096028.11114: variable 'ansible_shell_executable' from source: unknown 7530 1727096028.11117: variable 'ansible_connection' from source: unknown 7530 1727096028.11120: variable 'ansible_module_compression' from source: unknown 7530 1727096028.11122: variable 'ansible_shell_type' from source: unknown 7530 1727096028.11125: variable 'ansible_shell_executable' from source: unknown 7530 1727096028.11127: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.11129: variable 'ansible_pipelining' from source: unknown 7530 1727096028.11131: variable 'ansible_timeout' from source: unknown 7530 1727096028.11133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.11238: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096028.11248: variable 'omit' from source: magic vars 7530 1727096028.11253: starting attempt loop 7530 1727096028.11256: running the handler 7530 1727096028.11266: handler run complete 7530 1727096028.11277: attempt loop complete, returning result 7530 1727096028.11280: _execute() done 7530 1727096028.11282: dumping result to json 7530 1727096028.11284: done dumping result, returning 7530 1727096028.11290: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0afff68d-5257-086b-f4f0-000000000cef] 7530 1727096028.11294: sending task result for task 0afff68d-5257-086b-f4f0-000000000cef 7530 1727096028.11380: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cef 7530 1727096028.11383: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7530 1727096028.11436: no more pending results, returning what we have 7530 1727096028.11439: results queue empty 7530 1727096028.11440: checking for any_errors_fatal 7530 1727096028.11441: done checking for any_errors_fatal 7530 1727096028.11442: checking for max_fail_percentage 7530 1727096028.11444: done checking for max_fail_percentage 7530 1727096028.11444: checking to see if all hosts have failed and the running result is not ok 7530 1727096028.11445: done checking to see if all hosts have failed 7530 1727096028.11446: getting the remaining hosts for this loop 7530 1727096028.11448: done getting the remaining hosts for this loop 7530 1727096028.11451: getting the next task for host managed_node3 7530 1727096028.11457: done getting next task for host managed_node3 7530 1727096028.11460: ^ task is: TASK: Stat profile file 7530 1727096028.11464: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096028.11470: getting variables 7530 1727096028.11472: in VariableManager get_vars() 7530 1727096028.11516: Calling all_inventory to load vars for managed_node3 7530 1727096028.11519: Calling groups_inventory to load vars for managed_node3 7530 1727096028.11521: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096028.11531: Calling all_plugins_play to load vars for managed_node3 7530 1727096028.11533: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096028.11536: Calling groups_plugins_play to load vars for managed_node3 7530 1727096028.12317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096028.13161: done with get_vars() 7530 1727096028.13179: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:53:48 -0400 (0:00:00.034) 0:00:18.920 ****** 7530 1727096028.13248: entering _queue_task() for managed_node3/stat 7530 1727096028.13487: worker is 1 (out of 1 available) 7530 1727096028.13501: exiting _queue_task() for managed_node3/stat 7530 1727096028.13514: done queuing things up, now waiting for results queue to drain 7530 1727096028.13516: waiting for pending results... 
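Each TASK banner in this log carries a timing line such as `Monday 23 September 2024 08:53:48 -0400 (0:00:00.034) 0:00:18.920 ******`, where the parenthesized value is the previous task's duration and the second value is total elapsed playbook time. A minimal parser for pulling per-task durations out of a log like this one, assuming that format stays stable:

```python
import re

# Matches the "(H:MM:SS.mmm) H:MM:SS.mmm" pair on a timing line:
# the first group is the previous task's duration, the second is
# the cumulative elapsed time of the playbook run.
TIMING_RE = re.compile(
    r"\((?P<thr>\d+):(?P<tmin>\d+):(?P<tsec>\d+\.\d+)\)\s+"
    r"(?P<ehr>\d+):(?P<emin>\d+):(?P<esec>\d+\.\d+)"
)

def parse_timing(line):
    """Return (task_duration_s, total_elapsed_s), or None if absent."""
    m = TIMING_RE.search(line)
    if not m:
        return None
    task = int(m["thr"]) * 3600 + int(m["tmin"]) * 60 + float(m["tsec"])
    total = int(m["ehr"]) * 3600 + int(m["emin"]) * 60 + float(m["esec"])
    return task, total

line = "Monday 23 September 2024 08:53:48 -0400 (0:00:00.034) 0:00:18.920 ******"
print(parse_timing(line))  # (0.034, 18.92)
```

Summing the first element over all timing lines gives a rough per-task cost breakdown without needing the callback plugin's own summary.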
7530 1727096028.13693: running TaskExecutor() for managed_node3/TASK: Stat profile file 7530 1727096028.13772: in run() - task 0afff68d-5257-086b-f4f0-000000000cf0 7530 1727096028.13783: variable 'ansible_search_path' from source: unknown 7530 1727096028.13786: variable 'ansible_search_path' from source: unknown 7530 1727096028.13816: calling self._execute() 7530 1727096028.13893: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.13896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.13905: variable 'omit' from source: magic vars 7530 1727096028.14187: variable 'ansible_distribution_major_version' from source: facts 7530 1727096028.14194: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096028.14200: variable 'omit' from source: magic vars 7530 1727096028.14232: variable 'omit' from source: magic vars 7530 1727096028.14304: variable 'profile' from source: include params 7530 1727096028.14308: variable 'interface' from source: play vars 7530 1727096028.14360: variable 'interface' from source: play vars 7530 1727096028.14377: variable 'omit' from source: magic vars 7530 1727096028.14413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096028.14440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096028.14455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096028.14470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096028.14480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096028.14505: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096028.14508: variable 
'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.14511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.14579: Set connection var ansible_pipelining to False 7530 1727096028.14585: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096028.14590: Set connection var ansible_timeout to 10 7530 1727096028.14597: Set connection var ansible_shell_executable to /bin/sh 7530 1727096028.14601: Set connection var ansible_shell_type to sh 7530 1727096028.14605: Set connection var ansible_connection to ssh 7530 1727096028.14630: variable 'ansible_shell_executable' from source: unknown 7530 1727096028.14633: variable 'ansible_connection' from source: unknown 7530 1727096028.14636: variable 'ansible_module_compression' from source: unknown 7530 1727096028.14638: variable 'ansible_shell_type' from source: unknown 7530 1727096028.14640: variable 'ansible_shell_executable' from source: unknown 7530 1727096028.14642: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.14644: variable 'ansible_pipelining' from source: unknown 7530 1727096028.14647: variable 'ansible_timeout' from source: unknown 7530 1727096028.14649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.14792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096028.14801: variable 'omit' from source: magic vars 7530 1727096028.14805: starting attempt loop 7530 1727096028.14808: running the handler 7530 1727096028.14820: _low_level_execute_command(): starting 7530 1727096028.14830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096028.15330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096028.15347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096028.15351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.15363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.15428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.15431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.15433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.15477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.17133: stdout chunk (state=3): >>>/root <<< 7530 1727096028.17229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.17260: stderr chunk (state=3): >>><<< 7530 1727096028.17269: stdout chunk (state=3): >>><<< 7530 1727096028.17289: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.17301: _low_level_execute_command(): starting 7530 1727096028.17304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605 `" && echo ansible-tmp-1727096028.1728802-8287-195413813085605="` echo /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605 `" ) && sleep 0' 7530 1727096028.17773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.17776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.17787: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096028.17791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096028.17795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.17836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.17840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.17842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.17887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.19826: stdout chunk (state=3): >>>ansible-tmp-1727096028.1728802-8287-195413813085605=/root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605 <<< 7530 1727096028.19918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.19954: stderr chunk (state=3): >>><<< 7530 1727096028.19958: stdout chunk (state=3): >>><<< 7530 1727096028.19978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096028.1728802-8287-195413813085605=/root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.20016: variable 'ansible_module_compression' from source: unknown 7530 1727096028.20064: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7530 1727096028.20096: variable 'ansible_facts' from source: unknown 7530 1727096028.20145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py 7530 1727096028.20254: Sending initial data 7530 1727096028.20257: Sent initial data (151 bytes) 7530 1727096028.20718: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.20721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096028.20726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.20729: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096028.20732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.20774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.20777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.20788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.20823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.22420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096028.22451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096028.22484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpz3qjtzez /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py <<< 7530 1727096028.22491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py" <<< 7530 1727096028.22517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpz3qjtzez" to remote "/root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py" <<< 7530 1727096028.22521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py" <<< 7530 1727096028.23011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.23053: stderr chunk (state=3): >>><<< 7530 1727096028.23057: stdout chunk (state=3): >>><<< 7530 1727096028.23100: done transferring module to remote 7530 1727096028.23110: _low_level_execute_command(): starting 7530 1727096028.23114: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/ /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py && sleep 0' 7530 1727096028.23544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.23552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096028.23572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.23576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.23594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096028.23597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.23647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.23652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.23655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.23685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.25457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.25488: stderr chunk (state=3): >>><<< 7530 1727096028.25491: stdout chunk (state=3): >>><<< 7530 1727096028.25505: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.25507: _low_level_execute_command(): starting 7530 1727096028.25513: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/AnsiballZ_stat.py && sleep 0' 7530 1727096028.25934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.25949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096028.25952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.25964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
7530 1727096028.26022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.26031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.26070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.41787: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7530 1727096028.43279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096028.43304: stdout chunk (state=3): >>><<< 7530 1727096028.43308: stderr chunk (state=3): >>><<< 7530 1727096028.43326: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096028.43451: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096028.43455: _low_level_execute_command(): starting 7530 1727096028.43458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096028.1728802-8287-195413813085605/ > /dev/null 2>&1 && sleep 0' 7530 1727096028.44035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096028.44048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.44090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.44131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.44219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.44241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.44250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.44446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.46577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.46582: stdout chunk (state=3): >>><<< 7530 1727096028.46584: stderr chunk (state=3): >>><<< 7530 1727096028.46586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.46588: handler run complete 7530 1727096028.46590: attempt loop complete, returning result 7530 1727096028.46604: _execute() done 7530 1727096028.46606: dumping result to json 7530 1727096028.46614: done dumping result, returning 7530 1727096028.46624: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-086b-f4f0-000000000cf0] 7530 1727096028.46633: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf0 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7530 1727096028.46803: no more pending results, returning what we have 7530 1727096028.46807: results queue empty 7530 1727096028.46808: checking for any_errors_fatal 7530 1727096028.46814: done checking for any_errors_fatal 7530 1727096028.46815: checking for max_fail_percentage 7530 1727096028.46817: done checking for max_fail_percentage 7530 1727096028.46818: checking to see if all hosts have failed and the running result is not ok 7530 1727096028.46819: done checking to see if all hosts have failed 7530 1727096028.46820: getting the remaining hosts for this loop 7530 1727096028.46821: done getting the remaining hosts for this loop 7530 1727096028.46825: getting the next task for host managed_node3 7530 1727096028.46832: done getting next task for host managed_node3 7530 1727096028.46835: ^ task is: TASK: Set NM profile exist flag based on 
the profile files 7530 1727096028.46839: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096028.46844: getting variables 7530 1727096028.46845: in VariableManager get_vars() 7530 1727096028.46899: Calling all_inventory to load vars for managed_node3 7530 1727096028.46902: Calling groups_inventory to load vars for managed_node3 7530 1727096028.46904: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096028.46917: Calling all_plugins_play to load vars for managed_node3 7530 1727096028.46919: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096028.46922: Calling groups_plugins_play to load vars for managed_node3 7530 1727096028.48102: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf0 7530 1727096028.48107: WORKER PROCESS EXITING 7530 1727096028.50693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096028.53541: done with get_vars() 7530 1727096028.53578: done getting variables 7530 1727096028.53638: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:53:48 -0400 (0:00:00.404) 0:00:19.325 ****** 7530 1727096028.53666: entering _queue_task() for managed_node3/set_fact 7530 1727096028.54070: worker is 1 (out of 1 available) 7530 1727096028.54082: exiting _queue_task() for managed_node3/set_fact 7530 1727096028.54092: done queuing things up, now waiting for results queue to drain 7530 1727096028.54094: waiting for pending results... 7530 1727096028.54336: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7530 1727096028.54487: in run() - task 0afff68d-5257-086b-f4f0-000000000cf1 7530 1727096028.54508: variable 'ansible_search_path' from source: unknown 7530 1727096028.54516: variable 'ansible_search_path' from source: unknown 7530 1727096028.54555: calling self._execute() 7530 1727096028.54663: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.54677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.54697: variable 'omit' from source: magic vars 7530 1727096028.55216: variable 'ansible_distribution_major_version' from source: facts 7530 1727096028.55243: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096028.55380: variable 'profile_stat' from source: set_fact 7530 1727096028.55400: Evaluated conditional (profile_stat.stat.exists): False 7530 1727096028.55408: when evaluation is False, skipping this task 7530 1727096028.55415: _execute() done 7530 1727096028.55421: dumping result to json 7530 1727096028.55431: done dumping result, returning 7530 
1727096028.55441: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-086b-f4f0-000000000cf1] 7530 1727096028.55563: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf1 7530 1727096028.55634: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf1 7530 1727096028.55638: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7530 1727096028.55717: no more pending results, returning what we have 7530 1727096028.55720: results queue empty 7530 1727096028.55721: checking for any_errors_fatal 7530 1727096028.55733: done checking for any_errors_fatal 7530 1727096028.55733: checking for max_fail_percentage 7530 1727096028.55735: done checking for max_fail_percentage 7530 1727096028.55736: checking to see if all hosts have failed and the running result is not ok 7530 1727096028.55737: done checking to see if all hosts have failed 7530 1727096028.55738: getting the remaining hosts for this loop 7530 1727096028.55739: done getting the remaining hosts for this loop 7530 1727096028.55743: getting the next task for host managed_node3 7530 1727096028.55750: done getting next task for host managed_node3 7530 1727096028.55753: ^ task is: TASK: Get NM profile info 7530 1727096028.55757: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096028.55763: getting variables 7530 1727096028.55764: in VariableManager get_vars() 7530 1727096028.55822: Calling all_inventory to load vars for managed_node3 7530 1727096028.55828: Calling groups_inventory to load vars for managed_node3 7530 1727096028.55831: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096028.55845: Calling all_plugins_play to load vars for managed_node3 7530 1727096028.55847: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096028.55850: Calling groups_plugins_play to load vars for managed_node3 7530 1727096028.58392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096028.60212: done with get_vars() 7530 1727096028.60248: done getting variables 7530 1727096028.60354: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:53:48 -0400 (0:00:00.067) 0:00:19.392 ****** 7530 1727096028.60389: entering _queue_task() for managed_node3/shell 7530 1727096028.60391: Creating lock for shell 7530 1727096028.60865: worker is 1 (out of 1 available) 7530 1727096028.60879: exiting _queue_task() for managed_node3/shell 7530 1727096028.60890: done queuing things up, now waiting 
for results queue to drain 7530 1727096028.60891: waiting for pending results... 7530 1727096028.61078: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7530 1727096028.61226: in run() - task 0afff68d-5257-086b-f4f0-000000000cf2 7530 1727096028.61230: variable 'ansible_search_path' from source: unknown 7530 1727096028.61234: variable 'ansible_search_path' from source: unknown 7530 1727096028.61270: calling self._execute() 7530 1727096028.61381: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.61477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.61481: variable 'omit' from source: magic vars 7530 1727096028.61912: variable 'ansible_distribution_major_version' from source: facts 7530 1727096028.61944: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096028.61991: variable 'omit' from source: magic vars 7530 1727096028.62013: variable 'omit' from source: magic vars 7530 1727096028.62135: variable 'profile' from source: include params 7530 1727096028.62146: variable 'interface' from source: play vars 7530 1727096028.62235: variable 'interface' from source: play vars 7530 1727096028.62269: variable 'omit' from source: magic vars 7530 1727096028.62375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096028.62378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096028.62389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096028.62411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096028.62436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096028.62473: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096028.62486: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.62494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.62609: Set connection var ansible_pipelining to False 7530 1727096028.62622: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096028.62697: Set connection var ansible_timeout to 10 7530 1727096028.62701: Set connection var ansible_shell_executable to /bin/sh 7530 1727096028.62704: Set connection var ansible_shell_type to sh 7530 1727096028.62706: Set connection var ansible_connection to ssh 7530 1727096028.62708: variable 'ansible_shell_executable' from source: unknown 7530 1727096028.62710: variable 'ansible_connection' from source: unknown 7530 1727096028.62712: variable 'ansible_module_compression' from source: unknown 7530 1727096028.62714: variable 'ansible_shell_type' from source: unknown 7530 1727096028.62716: variable 'ansible_shell_executable' from source: unknown 7530 1727096028.62718: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096028.62720: variable 'ansible_pipelining' from source: unknown 7530 1727096028.62734: variable 'ansible_timeout' from source: unknown 7530 1727096028.62742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096028.62893: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096028.62909: variable 'omit' from source: magic vars 7530 1727096028.62946: starting attempt loop 7530 1727096028.62949: running the handler 7530 1727096028.62952: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096028.63028: _low_level_execute_command(): starting 7530 1727096028.63031: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096028.63802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096028.63910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.63929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.64031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.66329: stdout chunk (state=3): >>>/root <<< 7530 1727096028.66333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.66336: stdout chunk (state=3): >>><<< 7530 
1727096028.66339: stderr chunk (state=3): >>><<< 7530 1727096028.66341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.66345: _low_level_execute_command(): starting 7530 1727096028.66347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551 `" && echo ansible-tmp-1727096028.66232-8308-246786514629551="` echo /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551 `" ) && sleep 0' 7530 1727096028.67948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.68008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.68044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.70028: stdout chunk (state=3): >>>ansible-tmp-1727096028.66232-8308-246786514629551=/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551 <<< 7530 1727096028.70329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.70333: stdout chunk (state=3): >>><<< 7530 1727096028.70335: stderr chunk (state=3): >>><<< 7530 1727096028.70342: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096028.66232-8308-246786514629551=/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.70545: variable 'ansible_module_compression' from source: unknown 7530 1727096028.70562: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096028.70607: variable 'ansible_facts' from source: unknown 7530 1727096028.70872: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py 7530 1727096028.71101: Sending initial data 7530 1727096028.71109: Sent initial data (152 bytes) 7530 1727096028.72489: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096028.72666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096028.72756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.72908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.72951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.74591: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096028.74646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096028.74696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpi9nobkr4 /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py <<< 7530 1727096028.74706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py" <<< 7530 1727096028.74763: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpi9nobkr4" to remote "/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py" <<< 7530 1727096028.75573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.75755: stderr chunk (state=3): >>><<< 7530 1727096028.75759: stdout chunk (state=3): >>><<< 7530 1727096028.75761: done transferring module to remote 7530 1727096028.75763: _low_level_execute_command(): starting 7530 1727096028.75766: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/ /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py && sleep 0' 7530 1727096028.76472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096028.76488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096028.76550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096028.76656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096028.76679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.76701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.76844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096028.78676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096028.78680: stdout chunk (state=3): >>><<< 7530 1727096028.78682: stderr chunk (state=3): >>><<< 7530 1727096028.78693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096028.78704: _low_level_execute_command(): starting 7530 1727096028.78707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/AnsiballZ_command.py && sleep 0' 7530 1727096028.79176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096028.79251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096028.79254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096028.79377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096028.79380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096028.79427: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.03827: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-23 08:53:48.947069", "end": "2024-09-23 08:53:49.035897", "delta": "0:00:00.088828", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096029.05678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096029.05683: stderr chunk (state=3): >>><<< 7530 1727096029.05685: stdout chunk (state=3): >>><<< 7530 1727096029.05688: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-23 08:53:48.947069", "end": "2024-09-23 08:53:49.035897", "delta": "0:00:00.088828", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096029.05828: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096029.05833: _low_level_execute_command(): starting 7530 1727096029.05836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727096028.66232-8308-246786514629551/ > /dev/null 2>&1 && sleep 0' 7530 1727096029.06843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096029.06870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.06873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096029.06876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096029.06917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096029.06921: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096029.07118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.07159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.07210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.09105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.09109: stdout chunk (state=3): >>><<< 7530 1727096029.09111: stderr chunk (state=3): >>><<< 7530 1727096029.09128: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096029.09273: handler run complete 7530 1727096029.09280: Evaluated conditional (False): False 7530 1727096029.09283: attempt loop complete, returning result 7530 1727096029.09285: _execute() done 7530 1727096029.09291: dumping result to json 7530 1727096029.09293: done dumping result, returning 7530 1727096029.09296: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-086b-f4f0-000000000cf2] 7530 1727096029.09458: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf2 7530 1727096029.09532: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf2 7530 1727096029.09535: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.088828", "end": 
"2024-09-23 08:53:49.035897", "rc": 0, "start": "2024-09-23 08:53:48.947069" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7530 1727096029.09653: no more pending results, returning what we have 7530 1727096029.09657: results queue empty 7530 1727096029.09658: checking for any_errors_fatal 7530 1727096029.09666: done checking for any_errors_fatal 7530 1727096029.09669: checking for max_fail_percentage 7530 1727096029.09671: done checking for max_fail_percentage 7530 1727096029.09672: checking to see if all hosts have failed and the running result is not ok 7530 1727096029.09673: done checking to see if all hosts have failed 7530 1727096029.09674: getting the remaining hosts for this loop 7530 1727096029.09676: done getting the remaining hosts for this loop 7530 1727096029.09680: getting the next task for host managed_node3 7530 1727096029.09688: done getting next task for host managed_node3 7530 1727096029.09690: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7530 1727096029.09694: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096029.09698: getting variables 7530 1727096029.09700: in VariableManager get_vars() 7530 1727096029.09872: Calling all_inventory to load vars for managed_node3 7530 1727096029.09875: Calling groups_inventory to load vars for managed_node3 7530 1727096029.09877: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096029.09889: Calling all_plugins_play to load vars for managed_node3 7530 1727096029.09891: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096029.09894: Calling groups_plugins_play to load vars for managed_node3 7530 1727096029.11765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096029.13270: done with get_vars() 7530 1727096029.13289: done getting variables 7530 1727096029.13339: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:53:49 -0400 (0:00:00.529) 0:00:19.921 ****** 7530 1727096029.13361: entering _queue_task() for managed_node3/set_fact 7530 1727096029.13601: worker is 1 (out of 1 available) 7530 1727096029.13612: exiting _queue_task() for managed_node3/set_fact 7530 1727096029.13628: done queuing things up, now waiting for results queue to drain 7530 1727096029.13630: waiting for pending results... 
7530 1727096029.13804: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7530 1727096029.13880: in run() - task 0afff68d-5257-086b-f4f0-000000000cf3 7530 1727096029.13892: variable 'ansible_search_path' from source: unknown 7530 1727096029.13895: variable 'ansible_search_path' from source: unknown 7530 1727096029.13923: calling self._execute() 7530 1727096029.13999: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.14004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.14013: variable 'omit' from source: magic vars 7530 1727096029.14288: variable 'ansible_distribution_major_version' from source: facts 7530 1727096029.14297: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096029.14388: variable 'nm_profile_exists' from source: set_fact 7530 1727096029.14406: Evaluated conditional (nm_profile_exists.rc == 0): True 7530 1727096029.14409: variable 'omit' from source: magic vars 7530 1727096029.14440: variable 'omit' from source: magic vars 7530 1727096029.14462: variable 'omit' from source: magic vars 7530 1727096029.14497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096029.14528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096029.14542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096029.14555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.14564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.14594: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096029.14597: 
variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.14600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.14686: Set connection var ansible_pipelining to False 7530 1727096029.14690: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096029.14692: Set connection var ansible_timeout to 10 7530 1727096029.14727: Set connection var ansible_shell_executable to /bin/sh 7530 1727096029.14731: Set connection var ansible_shell_type to sh 7530 1727096029.14734: Set connection var ansible_connection to ssh 7530 1727096029.14739: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.14741: variable 'ansible_connection' from source: unknown 7530 1727096029.14744: variable 'ansible_module_compression' from source: unknown 7530 1727096029.14746: variable 'ansible_shell_type' from source: unknown 7530 1727096029.14748: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.14750: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.14752: variable 'ansible_pipelining' from source: unknown 7530 1727096029.14754: variable 'ansible_timeout' from source: unknown 7530 1727096029.14757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.15074: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096029.15077: variable 'omit' from source: magic vars 7530 1727096029.15079: starting attempt loop 7530 1727096029.15081: running the handler 7530 1727096029.15083: handler run complete 7530 1727096029.15085: attempt loop complete, returning result 7530 1727096029.15087: _execute() done 7530 1727096029.15089: dumping result to json 
7530 1727096029.15091: done dumping result, returning
7530 1727096029.15093: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-086b-f4f0-000000000cf3]
7530 1727096029.15099: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf3
7530 1727096029.15169: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf3
7530 1727096029.15172: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
7530 1727096029.15225: no more pending results, returning what we have
7530 1727096029.15228: results queue empty
7530 1727096029.15229: checking for any_errors_fatal
7530 1727096029.15240: done checking for any_errors_fatal
7530 1727096029.15241: checking for max_fail_percentage
7530 1727096029.15243: done checking for max_fail_percentage
7530 1727096029.15244: checking to see if all hosts have failed and the running result is not ok
7530 1727096029.15245: done checking to see if all hosts have failed
7530 1727096029.15246: getting the remaining hosts for this loop
7530 1727096029.15247: done getting the remaining hosts for this loop
7530 1727096029.15251: getting the next task for host managed_node3
7530 1727096029.15260: done getting next task for host managed_node3
7530 1727096029.15262: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
7530 1727096029.15266: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096029.15272: getting variables
7530 1727096029.15274: in VariableManager get_vars()
7530 1727096029.15320: Calling all_inventory to load vars for managed_node3
7530 1727096029.15323: Calling groups_inventory to load vars for managed_node3
7530 1727096029.15325: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096029.15335: Calling all_plugins_play to load vars for managed_node3
7530 1727096029.15337: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096029.15340: Calling groups_plugins_play to load vars for managed_node3
7530 1727096029.16579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096029.17865: done with get_vars()
7530 1727096029.17887: done getting variables
7530 1727096029.17934: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096029.18019: variable 'profile' from source: include params
7530 1727096029.18026: variable 'interface' from source: play vars
7530 1727096029.18073: variable 'interface' from source: play vars

TASK [Get the ansible_managed comment in ifcfg-veth0] **************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Monday 23 September 2024 08:53:49 -0400 (0:00:00.047) 0:00:19.969 ******
7530 1727096029.18100: entering _queue_task() for managed_node3/command
7530 1727096029.18338: worker is 1 (out of 1 available)
7530 1727096029.18349: exiting _queue_task() for managed_node3/command
7530 1727096029.18361: done queuing things up, now waiting for results queue to drain
7530 1727096029.18362: waiting for pending results...
7530 1727096029.18538: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0
7530 1727096029.18621: in run() - task 0afff68d-5257-086b-f4f0-000000000cf5
7530 1727096029.18634: variable 'ansible_search_path' from source: unknown
7530 1727096029.18637: variable 'ansible_search_path' from source: unknown
7530 1727096029.18664: calling self._execute()
7530 1727096029.18741: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.18746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.18755: variable 'omit' from source: magic vars
7530 1727096029.19029: variable 'ansible_distribution_major_version' from source: facts
7530 1727096029.19037: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096029.19122: variable 'profile_stat' from source: set_fact
7530 1727096029.19137: Evaluated conditional (profile_stat.stat.exists): False
7530 1727096029.19140: when evaluation is False, skipping this task
7530 1727096029.19143: _execute() done
7530 1727096029.19145: dumping result to json
7530 1727096029.19147: done dumping result, returning
7530 1727096029.19149: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000000cf5]
7530 1727096029.19155: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf5
7530 1727096029.19234: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf5
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
7530 1727096029.19292: no more pending results, returning what we have
7530 1727096029.19295: results queue empty
7530 1727096029.19296: checking for any_errors_fatal
7530 1727096029.19305: done checking for any_errors_fatal
7530 1727096029.19306: checking for max_fail_percentage
7530 1727096029.19308: done checking for max_fail_percentage
7530 1727096029.19308: checking to see if all hosts have failed and the running result is not ok
7530 1727096029.19309: done checking to see if all hosts have failed
7530 1727096029.19310: getting the remaining hosts for this loop
7530 1727096029.19313: done getting the remaining hosts for this loop
7530 1727096029.19317: getting the next task for host managed_node3
7530 1727096029.19325: done getting next task for host managed_node3
7530 1727096029.19328: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
7530 1727096029.19331: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096029.19335: getting variables
7530 1727096029.19336: in VariableManager get_vars()
7530 1727096029.19382: Calling all_inventory to load vars for managed_node3
7530 1727096029.19385: Calling groups_inventory to load vars for managed_node3
7530 1727096029.19387: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096029.19397: Calling all_plugins_play to load vars for managed_node3
7530 1727096029.19400: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096029.19403: Calling groups_plugins_play to load vars for managed_node3
7530 1727096029.19982: WORKER PROCESS EXITING
7530 1727096029.20746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096029.21605: done with get_vars()
7530 1727096029.21626: done getting variables
7530 1727096029.21674: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096029.21760: variable 'profile' from source: include params
7530 1727096029.21763: variable 'interface' from source: play vars
7530 1727096029.21807: variable 'interface' from source: play vars

TASK [Verify the ansible_managed comment in ifcfg-veth0] ***********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Monday 23 September 2024 08:53:49 -0400 (0:00:00.037) 0:00:20.006 ******
7530 1727096029.21832: entering _queue_task() for managed_node3/set_fact
7530 1727096029.22086: worker is 1 (out of 1 available)
7530 1727096029.22099: exiting _queue_task() for managed_node3/set_fact
7530 1727096029.22112: done queuing things up, now waiting for results queue to drain
7530 1727096029.22114: waiting for pending results...
7530 1727096029.22290: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0
7530 1727096029.22371: in run() - task 0afff68d-5257-086b-f4f0-000000000cf6
7530 1727096029.22384: variable 'ansible_search_path' from source: unknown
7530 1727096029.22387: variable 'ansible_search_path' from source: unknown
7530 1727096029.22413: calling self._execute()
7530 1727096029.22490: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.22496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.22504: variable 'omit' from source: magic vars
7530 1727096029.22776: variable 'ansible_distribution_major_version' from source: facts
7530 1727096029.22788: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096029.22874: variable 'profile_stat' from source: set_fact
7530 1727096029.22888: Evaluated conditional (profile_stat.stat.exists): False
7530 1727096029.22892: when evaluation is False, skipping this task
7530 1727096029.22895: _execute() done
7530 1727096029.22897: dumping result to json
7530 1727096029.22899: done dumping result, returning
7530 1727096029.22907: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000000cf6]
7530 1727096029.22910: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf6
7530 1727096029.22995: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf6
7530 1727096029.22998: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
7530 1727096029.23051: no more pending results, returning what we have
7530 1727096029.23055: results queue empty
7530 1727096029.23055: checking for any_errors_fatal
7530 1727096029.23062: done checking for any_errors_fatal
7530 1727096029.23063: checking for max_fail_percentage
7530 1727096029.23065: done checking for max_fail_percentage
7530 1727096029.23065: checking to see if all hosts have failed and the running result is not ok
7530 1727096029.23066: done checking to see if all hosts have failed
7530 1727096029.23069: getting the remaining hosts for this loop
7530 1727096029.23070: done getting the remaining hosts for this loop
7530 1727096029.23073: getting the next task for host managed_node3
7530 1727096029.23081: done getting next task for host managed_node3
7530 1727096029.23083: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
7530 1727096029.23088: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096029.23092: getting variables
7530 1727096029.23094: in VariableManager get_vars()
7530 1727096029.23147: Calling all_inventory to load vars for managed_node3
7530 1727096029.23150: Calling groups_inventory to load vars for managed_node3
7530 1727096029.23152: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096029.23163: Calling all_plugins_play to load vars for managed_node3
7530 1727096029.23165: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096029.23169: Calling groups_plugins_play to load vars for managed_node3
7530 1727096029.23936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096029.24869: done with get_vars()
7530 1727096029.24885: done getting variables
7530 1727096029.24928: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096029.25010: variable 'profile' from source: include params
7530 1727096029.25013: variable 'interface' from source: play vars
7530 1727096029.25052: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-veth0] ******************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Monday 23 September 2024 08:53:49 -0400 (0:00:00.032) 0:00:20.039 ******
7530 1727096029.25079: entering _queue_task() for managed_node3/command
7530 1727096029.25312: worker is 1 (out of 1 available)
7530 1727096029.25324: exiting _queue_task() for managed_node3/command
7530 1727096029.25336: done queuing things up, now waiting for results queue to drain
7530 1727096029.25338: waiting for pending results...
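The repeated skip pattern in these records ("Evaluated conditional (profile_stat.stat.exists): False", then "when evaluation is False, skipping this task") comes from tasks in get_profile_stat.yml guarded by a `when:` clause. A minimal sketch of such a guarded command task follows; the task name is taken from the log, but the command body is a hypothetical illustration, not the actual file contents:

```yaml
# Sketch of the when-guard producing the "skipping: ..." results in the log.
# The grep command shown here is an assumption for illustration only.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "fingerprint" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint_comment   # hypothetical register name
  when: profile_stat.stat.exists
```

When the guard is False, the action plugin is never executed; the controller short-circuits in `_execute()` and emits the `skipping:` result with `false_condition` set to the failing expression, exactly as seen above.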
7530 1727096029.25517: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0
7530 1727096029.25603: in run() - task 0afff68d-5257-086b-f4f0-000000000cf7
7530 1727096029.25613: variable 'ansible_search_path' from source: unknown
7530 1727096029.25617: variable 'ansible_search_path' from source: unknown
7530 1727096029.25646: calling self._execute()
7530 1727096029.25721: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.25724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.25736: variable 'omit' from source: magic vars
7530 1727096029.26014: variable 'ansible_distribution_major_version' from source: facts
7530 1727096029.26023: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096029.26113: variable 'profile_stat' from source: set_fact
7530 1727096029.26121: Evaluated conditional (profile_stat.stat.exists): False
7530 1727096029.26123: when evaluation is False, skipping this task
7530 1727096029.26129: _execute() done
7530 1727096029.26132: dumping result to json
7530 1727096029.26134: done dumping result, returning
7530 1727096029.26141: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000000cf7]
7530 1727096029.26146: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf7
7530 1727096029.26226: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf7
7530 1727096029.26229: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
7530 1727096029.26280: no more pending results, returning what we have
7530 1727096029.26283: results queue empty
7530 1727096029.26284: checking for any_errors_fatal
7530 1727096029.26291: done checking for any_errors_fatal
7530 1727096029.26292: checking for max_fail_percentage
7530 1727096029.26293: done checking for max_fail_percentage
7530 1727096029.26294: checking to see if all hosts have failed and the running result is not ok
7530 1727096029.26295: done checking to see if all hosts have failed
7530 1727096029.26296: getting the remaining hosts for this loop
7530 1727096029.26297: done getting the remaining hosts for this loop
7530 1727096029.26300: getting the next task for host managed_node3
7530 1727096029.26307: done getting next task for host managed_node3
7530 1727096029.26310: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
7530 1727096029.26313: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096029.26317: getting variables
7530 1727096029.26319: in VariableManager get_vars()
7530 1727096029.26361: Calling all_inventory to load vars for managed_node3
7530 1727096029.26364: Calling groups_inventory to load vars for managed_node3
7530 1727096029.26366: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096029.26379: Calling all_plugins_play to load vars for managed_node3
7530 1727096029.26381: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096029.26384: Calling groups_plugins_play to load vars for managed_node3
7530 1727096029.27142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096029.27988: done with get_vars()
7530 1727096029.28004: done getting variables
7530 1727096029.28047: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096029.28127: variable 'profile' from source: include params
7530 1727096029.28130: variable 'interface' from source: play vars
7530 1727096029.28170: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-veth0] ***************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Monday 23 September 2024 08:53:49 -0400 (0:00:00.031) 0:00:20.070 ******
7530 1727096029.28194: entering _queue_task() for managed_node3/set_fact
7530 1727096029.28423: worker is 1 (out of 1 available)
7530 1727096029.28435: exiting _queue_task() for managed_node3/set_fact
7530 1727096029.28449: done queuing things up, now waiting for results queue to drain
7530 1727096029.28451: waiting for pending results...
7530 1727096029.28628: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0
7530 1727096029.28711: in run() - task 0afff68d-5257-086b-f4f0-000000000cf8
7530 1727096029.28721: variable 'ansible_search_path' from source: unknown
7530 1727096029.28724: variable 'ansible_search_path' from source: unknown
7530 1727096029.28753: calling self._execute()
7530 1727096029.28830: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.28836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.28844: variable 'omit' from source: magic vars
7530 1727096029.29118: variable 'ansible_distribution_major_version' from source: facts
7530 1727096029.29130: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096029.29214: variable 'profile_stat' from source: set_fact
7530 1727096029.29226: Evaluated conditional (profile_stat.stat.exists): False
7530 1727096029.29233: when evaluation is False, skipping this task
7530 1727096029.29236: _execute() done
7530 1727096029.29239: dumping result to json
7530 1727096029.29241: done dumping result, returning
7530 1727096029.29247: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000000cf8]
7530 1727096029.29251: sending task result for task 0afff68d-5257-086b-f4f0-000000000cf8
7530 1727096029.29337: done sending task result for task 0afff68d-5257-086b-f4f0-000000000cf8
7530 1727096029.29339: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
7530 1727096029.29388: no more pending results, returning what we have
7530 1727096029.29392: results queue empty
7530 1727096029.29392: checking for any_errors_fatal
7530 1727096029.29398: done checking for any_errors_fatal
7530 1727096029.29399: checking for max_fail_percentage
7530 1727096029.29401: done checking for max_fail_percentage
7530 1727096029.29402: checking to see if all hosts have failed and the running result is not ok
7530 1727096029.29403: done checking to see if all hosts have failed
7530 1727096029.29404: getting the remaining hosts for this loop
7530 1727096029.29405: done getting the remaining hosts for this loop
7530 1727096029.29408: getting the next task for host managed_node3
7530 1727096029.29415: done getting next task for host managed_node3
7530 1727096029.29418: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
7530 1727096029.29421: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096029.29427: getting variables
7530 1727096029.29429: in VariableManager get_vars()
7530 1727096029.29484: Calling all_inventory to load vars for managed_node3
7530 1727096029.29487: Calling groups_inventory to load vars for managed_node3
7530 1727096029.29489: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096029.29500: Calling all_plugins_play to load vars for managed_node3
7530 1727096029.29502: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096029.29504: Calling groups_plugins_play to load vars for managed_node3
7530 1727096029.30400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096029.31236: done with get_vars()
7530 1727096029.31254: done getting variables
7530 1727096029.31303: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096029.31390: variable 'profile' from source: include params
7530 1727096029.31393: variable 'interface' from source: play vars
7530 1727096029.31435: variable 'interface' from source: play vars

TASK [Assert that the profile is present - 'veth0'] ****************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Monday 23 September 2024 08:53:49 -0400 (0:00:00.032) 0:00:20.102 ******
7530 1727096029.31459: entering _queue_task() for managed_node3/assert
7530 1727096029.31707: worker is 1 (out of 1 available)
7530 1727096029.31720: exiting _queue_task() for managed_node3/assert
7530 1727096029.31732: done queuing things up, now waiting for results queue to drain
7530 1727096029.31734: waiting for pending results...
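The assert task queued here checks the fact set earlier in the run; the log records for it show the `assert` action module and the conditional `lsr_net_profile_exists` evaluating True. A sketch of the task shape at assert_profile_present.yml:5, inferred from those records rather than copied from the file:

```yaml
# Sketch only: reconstructed from the log (module 'assert',
# conditional 'lsr_net_profile_exists'); may differ from the real file.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
```

Because `lsr_net_profile_exists` was set to a truthy value by the earlier set_fact task, the assertion passes and the task reports ok with "All assertions passed".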
7530 1727096029.31918: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0'
7530 1727096029.31996: in run() - task 0afff68d-5257-086b-f4f0-000000000adf
7530 1727096029.32007: variable 'ansible_search_path' from source: unknown
7530 1727096029.32011: variable 'ansible_search_path' from source: unknown
7530 1727096029.32041: calling self._execute()
7530 1727096029.32122: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.32129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.32138: variable 'omit' from source: magic vars
7530 1727096029.32413: variable 'ansible_distribution_major_version' from source: facts
7530 1727096029.32424: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096029.32432: variable 'omit' from source: magic vars
7530 1727096029.32461: variable 'omit' from source: magic vars
7530 1727096029.32538: variable 'profile' from source: include params
7530 1727096029.32542: variable 'interface' from source: play vars
7530 1727096029.32587: variable 'interface' from source: play vars
7530 1727096029.32605: variable 'omit' from source: magic vars
7530 1727096029.32642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096029.32669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096029.32686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096029.32699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096029.32708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096029.32737: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096029.32741: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.32743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.32811: Set connection var ansible_pipelining to False
7530 1727096029.32816: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096029.32821: Set connection var ansible_timeout to 10
7530 1727096029.32835: Set connection var ansible_shell_executable to /bin/sh
7530 1727096029.32838: Set connection var ansible_shell_type to sh
7530 1727096029.32841: Set connection var ansible_connection to ssh
7530 1727096029.32859: variable 'ansible_shell_executable' from source: unknown
7530 1727096029.32862: variable 'ansible_connection' from source: unknown
7530 1727096029.32864: variable 'ansible_module_compression' from source: unknown
7530 1727096029.32867: variable 'ansible_shell_type' from source: unknown
7530 1727096029.32870: variable 'ansible_shell_executable' from source: unknown
7530 1727096029.32873: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.32876: variable 'ansible_pipelining' from source: unknown
7530 1727096029.32878: variable 'ansible_timeout' from source: unknown
7530 1727096029.32883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.32986: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096029.32996: variable 'omit' from source: magic vars
7530 1727096029.33002: starting attempt loop
7530 1727096029.33004: running the handler
7530 1727096029.33088: variable 'lsr_net_profile_exists' from source: set_fact
7530 1727096029.33091: Evaluated conditional (lsr_net_profile_exists): True
7530 1727096029.33098: handler run complete
7530 1727096029.33109: attempt loop complete, returning result
7530 1727096029.33111: _execute() done
7530 1727096029.33114: dumping result to json
7530 1727096029.33117: done dumping result, returning
7530 1727096029.33122: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [0afff68d-5257-086b-f4f0-000000000adf]
7530 1727096029.33129: sending task result for task 0afff68d-5257-086b-f4f0-000000000adf
7530 1727096029.33209: done sending task result for task 0afff68d-5257-086b-f4f0-000000000adf
7530 1727096029.33212: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7530 1727096029.33257: no more pending results, returning what we have
7530 1727096029.33260: results queue empty
7530 1727096029.33261: checking for any_errors_fatal
7530 1727096029.33266: done checking for any_errors_fatal
7530 1727096029.33266: checking for max_fail_percentage
7530 1727096029.33270: done checking for max_fail_percentage
7530 1727096029.33271: checking to see if all hosts have failed and the running result is not ok
7530 1727096029.33272: done checking to see if all hosts have failed
7530 1727096029.33273: getting the remaining hosts for this loop
7530 1727096029.33274: done getting the remaining hosts for this loop
7530 1727096029.33277: getting the next task for host managed_node3
7530 1727096029.33284: done getting next task for host managed_node3
7530 1727096029.33286: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
7530 1727096029.33289: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096029.33293: getting variables
7530 1727096029.33295: in VariableManager get_vars()
7530 1727096029.33345: Calling all_inventory to load vars for managed_node3
7530 1727096029.33348: Calling groups_inventory to load vars for managed_node3
7530 1727096029.33350: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096029.33361: Calling all_plugins_play to load vars for managed_node3
7530 1727096029.33363: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096029.33366: Calling groups_plugins_play to load vars for managed_node3
7530 1727096029.34165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096029.35011: done with get_vars()
7530 1727096029.35030: done getting variables
7530 1727096029.35076: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096029.35162: variable 'profile' from source: include params
7530 1727096029.35165: variable 'interface' from source: play vars
7530 1727096029.35205: variable 'interface' from source: play vars

TASK [Assert that the ansible managed comment is present in 'veth0'] ***********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Monday 23 September 2024 08:53:49 -0400 (0:00:00.037) 0:00:20.140 ******
7530 1727096029.35235: entering _queue_task() for managed_node3/assert
7530 1727096029.35483: worker is 1 (out of 1 available)
7530 1727096029.35496: exiting _queue_task() for managed_node3/assert
7530 1727096029.35509: done queuing things up, now waiting for results queue to drain
7530 1727096029.35511: waiting for pending results...
7530 1727096029.35694: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0'
7530 1727096029.35772: in run() - task 0afff68d-5257-086b-f4f0-000000000ae0
7530 1727096029.35782: variable 'ansible_search_path' from source: unknown
7530 1727096029.35785: variable 'ansible_search_path' from source: unknown
7530 1727096029.35815: calling self._execute()
7530 1727096029.35898: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.35901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.35910: variable 'omit' from source: magic vars
7530 1727096029.36183: variable 'ansible_distribution_major_version' from source: facts
7530 1727096029.36194: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096029.36197: variable 'omit' from source: magic vars
7530 1727096029.36231: variable 'omit' from source: magic vars
7530 1727096029.36303: variable 'profile' from source: include params
7530 1727096029.36306: variable 'interface' from source: play vars
7530 1727096029.36354: variable 'interface' from source: play vars
7530 1727096029.36371: variable 'omit' from source: magic vars
7530 1727096029.36407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096029.36437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096029.36453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096029.36468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096029.36479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096029.36504: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096029.36507: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.36510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.36585: Set connection var ansible_pipelining to False
7530 1727096029.36588: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096029.36595: Set connection var ansible_timeout to 10
7530 1727096029.36604: Set connection var ansible_shell_executable to /bin/sh
7530 1727096029.36607: Set connection var ansible_shell_type to sh
7530 1727096029.36610: Set connection var ansible_connection to ssh
7530 1727096029.36630: variable 'ansible_shell_executable' from source: unknown
7530 1727096029.36634: variable 'ansible_connection' from source: unknown
7530 1727096029.36638: variable 'ansible_module_compression' from source: unknown
7530 1727096029.36641: variable 'ansible_shell_type' from source: unknown
7530 1727096029.36643: variable 'ansible_shell_executable' from source: unknown
7530 1727096029.36645: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096029.36648: variable 'ansible_pipelining' from source: unknown
7530 1727096029.36650: variable 'ansible_timeout' from source: unknown
7530 1727096029.36652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096029.36758: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096029.36771: variable 'omit' from source: magic vars
7530 1727096029.36775: starting attempt loop
7530
1727096029.36777: running the handler 7530 1727096029.36857: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7530 1727096029.36860: Evaluated conditional (lsr_net_profile_ansible_managed): True 7530 1727096029.36867: handler run complete 7530 1727096029.36883: attempt loop complete, returning result 7530 1727096029.36886: _execute() done 7530 1727096029.36888: dumping result to json 7530 1727096029.36891: done dumping result, returning 7530 1727096029.36896: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0afff68d-5257-086b-f4f0-000000000ae0] 7530 1727096029.36901: sending task result for task 0afff68d-5257-086b-f4f0-000000000ae0 7530 1727096029.36982: done sending task result for task 0afff68d-5257-086b-f4f0-000000000ae0 7530 1727096029.36985: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096029.37030: no more pending results, returning what we have 7530 1727096029.37033: results queue empty 7530 1727096029.37033: checking for any_errors_fatal 7530 1727096029.37039: done checking for any_errors_fatal 7530 1727096029.37040: checking for max_fail_percentage 7530 1727096029.37041: done checking for max_fail_percentage 7530 1727096029.37042: checking to see if all hosts have failed and the running result is not ok 7530 1727096029.37043: done checking to see if all hosts have failed 7530 1727096029.37044: getting the remaining hosts for this loop 7530 1727096029.37045: done getting the remaining hosts for this loop 7530 1727096029.37048: getting the next task for host managed_node3 7530 1727096029.37055: done getting next task for host managed_node3 7530 1727096029.37057: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7530 1727096029.37060: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096029.37063: getting variables 7530 1727096029.37064: in VariableManager get_vars() 7530 1727096029.37116: Calling all_inventory to load vars for managed_node3 7530 1727096029.37119: Calling groups_inventory to load vars for managed_node3 7530 1727096029.37122: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096029.37133: Calling all_plugins_play to load vars for managed_node3 7530 1727096029.37135: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096029.37138: Calling groups_plugins_play to load vars for managed_node3 7530 1727096029.41729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096029.42563: done with get_vars() 7530 1727096029.42586: done getting variables 7530 1727096029.42625: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096029.42695: variable 'profile' from source: include params 7530 1727096029.42697: variable 'interface' from source: play vars 7530 1727096029.42740: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 
2024 08:53:49 -0400 (0:00:00.075) 0:00:20.216 ****** 7530 1727096029.42764: entering _queue_task() for managed_node3/assert 7530 1727096029.43021: worker is 1 (out of 1 available) 7530 1727096029.43034: exiting _queue_task() for managed_node3/assert 7530 1727096029.43046: done queuing things up, now waiting for results queue to drain 7530 1727096029.43048: waiting for pending results... 7530 1727096029.43233: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7530 1727096029.43321: in run() - task 0afff68d-5257-086b-f4f0-000000000ae1 7530 1727096029.43334: variable 'ansible_search_path' from source: unknown 7530 1727096029.43338: variable 'ansible_search_path' from source: unknown 7530 1727096029.43367: calling self._execute() 7530 1727096029.43449: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.43454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.43463: variable 'omit' from source: magic vars 7530 1727096029.43754: variable 'ansible_distribution_major_version' from source: facts 7530 1727096029.43765: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096029.43772: variable 'omit' from source: magic vars 7530 1727096029.43798: variable 'omit' from source: magic vars 7530 1727096029.43874: variable 'profile' from source: include params 7530 1727096029.43878: variable 'interface' from source: play vars 7530 1727096029.43930: variable 'interface' from source: play vars 7530 1727096029.43943: variable 'omit' from source: magic vars 7530 1727096029.43978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096029.44004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096029.44020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096029.44037: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.44049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.44074: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096029.44078: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.44080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.44151: Set connection var ansible_pipelining to False 7530 1727096029.44155: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096029.44161: Set connection var ansible_timeout to 10 7530 1727096029.44170: Set connection var ansible_shell_executable to /bin/sh 7530 1727096029.44174: Set connection var ansible_shell_type to sh 7530 1727096029.44176: Set connection var ansible_connection to ssh 7530 1727096029.44195: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.44198: variable 'ansible_connection' from source: unknown 7530 1727096029.44201: variable 'ansible_module_compression' from source: unknown 7530 1727096029.44204: variable 'ansible_shell_type' from source: unknown 7530 1727096029.44206: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.44209: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.44211: variable 'ansible_pipelining' from source: unknown 7530 1727096029.44214: variable 'ansible_timeout' from source: unknown 7530 1727096029.44219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.44323: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096029.44334: variable 'omit' from source: magic vars 7530 1727096029.44339: starting attempt loop 7530 1727096029.44342: running the handler 7530 1727096029.44420: variable 'lsr_net_profile_fingerprint' from source: set_fact 7530 1727096029.44424: Evaluated conditional (lsr_net_profile_fingerprint): True 7530 1727096029.44432: handler run complete 7530 1727096029.44443: attempt loop complete, returning result 7530 1727096029.44446: _execute() done 7530 1727096029.44448: dumping result to json 7530 1727096029.44451: done dumping result, returning 7530 1727096029.44456: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [0afff68d-5257-086b-f4f0-000000000ae1] 7530 1727096029.44461: sending task result for task 0afff68d-5257-086b-f4f0-000000000ae1 7530 1727096029.44541: done sending task result for task 0afff68d-5257-086b-f4f0-000000000ae1 7530 1727096029.44543: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096029.44614: no more pending results, returning what we have 7530 1727096029.44617: results queue empty 7530 1727096029.44617: checking for any_errors_fatal 7530 1727096029.44625: done checking for any_errors_fatal 7530 1727096029.44625: checking for max_fail_percentage 7530 1727096029.44627: done checking for max_fail_percentage 7530 1727096029.44628: checking to see if all hosts have failed and the running result is not ok 7530 1727096029.44629: done checking to see if all hosts have failed 7530 1727096029.44629: getting the remaining hosts for this loop 7530 1727096029.44631: done getting the remaining hosts for this loop 7530 1727096029.44634: getting the next task for host managed_node3 7530 1727096029.44642: done getting next task for host managed_node3 7530 
1727096029.44645: ^ task is: TASK: Show ipv4 routes 7530 1727096029.44646: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096029.44650: getting variables 7530 1727096029.44651: in VariableManager get_vars() 7530 1727096029.44699: Calling all_inventory to load vars for managed_node3 7530 1727096029.44702: Calling groups_inventory to load vars for managed_node3 7530 1727096029.44704: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096029.44714: Calling all_plugins_play to load vars for managed_node3 7530 1727096029.44717: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096029.44719: Calling groups_plugins_play to load vars for managed_node3 7530 1727096029.45494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096029.46355: done with get_vars() 7530 1727096029.46373: done getting variables 7530 1727096029.46417: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:48 Monday 23 September 2024 08:53:49 -0400 (0:00:00.036) 0:00:20.252 ****** 7530 1727096029.46440: entering _queue_task() for managed_node3/command 7530 1727096029.46680: worker is 1 (out of 1 available) 7530 1727096029.46693: exiting _queue_task() for managed_node3/command 7530 
1727096029.46705: done queuing things up, now waiting for results queue to drain 7530 1727096029.46706: waiting for pending results... 7530 1727096029.46893: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7530 1727096029.46958: in run() - task 0afff68d-5257-086b-f4f0-00000000005d 7530 1727096029.46972: variable 'ansible_search_path' from source: unknown 7530 1727096029.47003: calling self._execute() 7530 1727096029.47091: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.47097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.47105: variable 'omit' from source: magic vars 7530 1727096029.47391: variable 'ansible_distribution_major_version' from source: facts 7530 1727096029.47403: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096029.47408: variable 'omit' from source: magic vars 7530 1727096029.47423: variable 'omit' from source: magic vars 7530 1727096029.47451: variable 'omit' from source: magic vars 7530 1727096029.47488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096029.47514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096029.47533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096029.47546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.47556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.47582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096029.47586: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.47589: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096029.47659: Set connection var ansible_pipelining to False 7530 1727096029.47664: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096029.47671: Set connection var ansible_timeout to 10 7530 1727096029.47678: Set connection var ansible_shell_executable to /bin/sh 7530 1727096029.47681: Set connection var ansible_shell_type to sh 7530 1727096029.47685: Set connection var ansible_connection to ssh 7530 1727096029.47706: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.47710: variable 'ansible_connection' from source: unknown 7530 1727096029.47713: variable 'ansible_module_compression' from source: unknown 7530 1727096029.47715: variable 'ansible_shell_type' from source: unknown 7530 1727096029.47718: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.47720: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.47722: variable 'ansible_pipelining' from source: unknown 7530 1727096029.47725: variable 'ansible_timeout' from source: unknown 7530 1727096029.47726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.47828: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096029.47839: variable 'omit' from source: magic vars 7530 1727096029.47843: starting attempt loop 7530 1727096029.47845: running the handler 7530 1727096029.47860: _low_level_execute_command(): starting 7530 1727096029.47866: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096029.48365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7530 1727096029.48394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096029.48399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096029.48452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096029.48456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.48458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.48507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.50156: stdout chunk (state=3): >>>/root <<< 7530 1727096029.50250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.50281: stderr chunk (state=3): >>><<< 7530 1727096029.50284: stdout chunk (state=3): >>><<< 7530 1727096029.50305: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096029.50323: _low_level_execute_command(): starting 7530 1727096029.50330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962 `" && echo ansible-tmp-1727096029.5030725-8337-213482036984962="` echo /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962 `" ) && sleep 0' 7530 1727096029.50771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096029.50775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096029.50777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.50787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096029.50790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096029.50842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096029.50847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.50850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.50879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.52803: stdout chunk (state=3): >>>ansible-tmp-1727096029.5030725-8337-213482036984962=/root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962 <<< 7530 1727096029.52905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.52935: stderr chunk (state=3): >>><<< 7530 1727096029.52938: stdout chunk (state=3): >>><<< 7530 1727096029.52961: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096029.5030725-8337-213482036984962=/root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096029.52990: variable 'ansible_module_compression' from source: unknown 7530 1727096029.53037: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096029.53072: variable 'ansible_facts' from source: unknown 7530 1727096029.53129: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py 7530 1727096029.53232: Sending initial data 7530 1727096029.53235: Sent initial data (154 bytes) 7530 1727096029.53915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.53919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.55507: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7530 1727096029.55549: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096029.55620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096029.55625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp62lk1swh /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py <<< 7530 1727096029.55652: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py" <<< 7530 1727096029.55696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp62lk1swh" to remote "/root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py" <<< 7530 1727096029.56614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.56620: stdout chunk (state=3): >>><<< 7530 1727096029.56627: stderr chunk (state=3): >>><<< 7530 1727096029.56631: done transferring module to remote 7530 1727096029.56634: _low_level_execute_command(): starting 7530 1727096029.56638: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/ /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py && sleep 0' 7530 1727096029.57222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096029.57244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.57261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096029.57373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096029.57396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.57423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.57525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.59356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.59383: stderr chunk (state=3): >>><<< 7530 1727096029.59387: stdout chunk (state=3): >>><<< 7530 1727096029.59402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096029.59405: _low_level_execute_command(): starting 7530 1727096029.59410: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/AnsiballZ_command.py && sleep 0' 7530 1727096029.59869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096029.59874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096029.59877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.59879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096029.59935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096029.59942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 7530 1727096029.59944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.59984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.76356: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-23 08:53:49.756984", "end": "2024-09-23 08:53:49.761184", "delta": "0:00:00.004200", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096029.78180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096029.78184: stdout chunk (state=3): >>><<< 7530 1727096029.78186: stderr chunk (state=3): >>><<< 7530 1727096029.78189: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-23 08:53:49.756984", "end": "2024-09-23 08:53:49.761184", "delta": "0:00:00.004200", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096029.78196: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096029.78211: _low_level_execute_command(): starting 7530 1727096029.78220: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096029.5030725-8337-213482036984962/ > /dev/null 2>&1 && sleep 0' 7530 1727096029.78918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096029.78938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.79060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.79090: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.79161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.79179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.81085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.81107: stdout chunk (state=3): >>><<< 7530 1727096029.81123: stderr chunk (state=3): >>><<< 7530 1727096029.81274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096029.81277: handler run complete 7530 1727096029.81279: Evaluated conditional (False): False 7530 1727096029.81282: attempt loop complete, returning result 7530 1727096029.81284: _execute() done 7530 1727096029.81286: dumping result to json 7530 1727096029.81288: done dumping result, returning 7530 1727096029.81290: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [0afff68d-5257-086b-f4f0-00000000005d] 7530 1727096029.81292: sending task result for task 0afff68d-5257-086b-f4f0-00000000005d 7530 1727096029.81361: done sending task result for task 0afff68d-5257-086b-f4f0-00000000005d 7530 1727096029.81364: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.004200", "end": "2024-09-23 08:53:49.761184", "rc": 0, "start": "2024-09-23 08:53:49.756984" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 default via 203.0.113.1 dev veth0 proto static metric 65535 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 7530 1727096029.81442: no more pending results, returning what we have 7530 1727096029.81445: results queue empty 7530 1727096029.81446: checking for any_errors_fatal 7530 1727096029.81452: done checking for any_errors_fatal 7530 1727096029.81452: checking for max_fail_percentage 7530 1727096029.81454: done checking for max_fail_percentage 7530 1727096029.81455: checking to see if all hosts have failed and the running result is not ok 7530 1727096029.81456: done checking to see if all hosts have failed 7530 1727096029.81457: getting the remaining hosts for this loop 7530 1727096029.81458: done getting the remaining hosts for this loop 7530 1727096029.81462: getting the next task for host managed_node3 7530 1727096029.81474: done getting next task 
for host managed_node3 7530 1727096029.81477: ^ task is: TASK: Assert default ipv4 route is present 7530 1727096029.81479: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096029.81483: getting variables 7530 1727096029.81485: in VariableManager get_vars() 7530 1727096029.81539: Calling all_inventory to load vars for managed_node3 7530 1727096029.81542: Calling groups_inventory to load vars for managed_node3 7530 1727096029.81545: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096029.81557: Calling all_plugins_play to load vars for managed_node3 7530 1727096029.81560: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096029.81563: Calling groups_plugins_play to load vars for managed_node3 7530 1727096029.83384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096029.84980: done with get_vars() 7530 1727096029.85008: done getting variables 7530 1727096029.85074: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is present] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:52 Monday 23 September 2024 08:53:49 -0400 (0:00:00.386) 0:00:20.639 ****** 7530 1727096029.85101: entering _queue_task() for managed_node3/assert 7530 1727096029.85430: worker is 1 (out of 1 available) 7530 1727096029.85442: exiting 
_queue_task() for managed_node3/assert 7530 1727096029.85454: done queuing things up, now waiting for results queue to drain 7530 1727096029.85455: waiting for pending results... 7530 1727096029.85823: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present 7530 1727096029.85920: in run() - task 0afff68d-5257-086b-f4f0-00000000005e 7530 1727096029.85925: variable 'ansible_search_path' from source: unknown 7530 1727096029.85946: calling self._execute() 7530 1727096029.86060: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.86077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.86092: variable 'omit' from source: magic vars 7530 1727096029.86575: variable 'ansible_distribution_major_version' from source: facts 7530 1727096029.86579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096029.86582: variable 'omit' from source: magic vars 7530 1727096029.86584: variable 'omit' from source: magic vars 7530 1727096029.86590: variable 'omit' from source: magic vars 7530 1727096029.86636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096029.86684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096029.86710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096029.86733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.86751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.86794: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096029.86804: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.86812: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.86923: Set connection var ansible_pipelining to False 7530 1727096029.86937: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096029.87009: Set connection var ansible_timeout to 10 7530 1727096029.87012: Set connection var ansible_shell_executable to /bin/sh 7530 1727096029.87015: Set connection var ansible_shell_type to sh 7530 1727096029.87018: Set connection var ansible_connection to ssh 7530 1727096029.87021: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.87023: variable 'ansible_connection' from source: unknown 7530 1727096029.87025: variable 'ansible_module_compression' from source: unknown 7530 1727096029.87028: variable 'ansible_shell_type' from source: unknown 7530 1727096029.87030: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.87032: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.87040: variable 'ansible_pipelining' from source: unknown 7530 1727096029.87048: variable 'ansible_timeout' from source: unknown 7530 1727096029.87056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.87206: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096029.87231: variable 'omit' from source: magic vars 7530 1727096029.87242: starting attempt loop 7530 1727096029.87250: running the handler 7530 1727096029.87445: variable '__test_str' from source: task vars 7530 1727096029.87498: variable 'interface' from source: play vars 7530 1727096029.87514: variable 'ipv4_routes' from source: set_fact 7530 1727096029.87531: Evaluated conditional (__test_str in ipv4_routes.stdout): 
True 7530 1727096029.87541: handler run complete 7530 1727096029.87663: attempt loop complete, returning result 7530 1727096029.87667: _execute() done 7530 1727096029.87671: dumping result to json 7530 1727096029.87673: done dumping result, returning 7530 1727096029.87676: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present [0afff68d-5257-086b-f4f0-00000000005e] 7530 1727096029.87678: sending task result for task 0afff68d-5257-086b-f4f0-00000000005e 7530 1727096029.87748: done sending task result for task 0afff68d-5257-086b-f4f0-00000000005e 7530 1727096029.87752: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096029.87820: no more pending results, returning what we have 7530 1727096029.87824: results queue empty 7530 1727096029.87825: checking for any_errors_fatal 7530 1727096029.87833: done checking for any_errors_fatal 7530 1727096029.87834: checking for max_fail_percentage 7530 1727096029.87836: done checking for max_fail_percentage 7530 1727096029.87837: checking to see if all hosts have failed and the running result is not ok 7530 1727096029.87838: done checking to see if all hosts have failed 7530 1727096029.87839: getting the remaining hosts for this loop 7530 1727096029.87840: done getting the remaining hosts for this loop 7530 1727096029.87844: getting the next task for host managed_node3 7530 1727096029.87851: done getting next task for host managed_node3 7530 1727096029.87854: ^ task is: TASK: Get ipv6 routes 7530 1727096029.87857: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096029.87861: getting variables 7530 1727096029.87863: in VariableManager get_vars() 7530 1727096029.88024: Calling all_inventory to load vars for managed_node3 7530 1727096029.88027: Calling groups_inventory to load vars for managed_node3 7530 1727096029.88030: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096029.88042: Calling all_plugins_play to load vars for managed_node3 7530 1727096029.88046: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096029.88049: Calling groups_plugins_play to load vars for managed_node3 7530 1727096029.89560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096029.91360: done with get_vars() 7530 1727096029.91390: done getting variables 7530 1727096029.91458: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:57 Monday 23 September 2024 08:53:49 -0400 (0:00:00.063) 0:00:20.703 ****** 7530 1727096029.91489: entering _queue_task() for managed_node3/command 7530 1727096029.91903: worker is 1 (out of 1 available) 7530 1727096029.91916: exiting _queue_task() for managed_node3/command 7530 1727096029.91929: done queuing things up, now waiting for results queue to drain 7530 1727096029.91931: waiting for pending results... 
7530 1727096029.92214: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7530 1727096029.92375: in run() - task 0afff68d-5257-086b-f4f0-00000000005f 7530 1727096029.92381: variable 'ansible_search_path' from source: unknown 7530 1727096029.92388: calling self._execute() 7530 1727096029.92529: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.92543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.92561: variable 'omit' from source: magic vars 7530 1727096029.92992: variable 'ansible_distribution_major_version' from source: facts 7530 1727096029.93014: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096029.93274: variable 'omit' from source: magic vars 7530 1727096029.93278: variable 'omit' from source: magic vars 7530 1727096029.93281: variable 'omit' from source: magic vars 7530 1727096029.93284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096029.93287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096029.93289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096029.93291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.93293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096029.93295: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096029.93298: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.93300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.93403: Set connection var ansible_pipelining to False 7530 1727096029.93424: Set connection var 
ansible_module_compression to ZIP_DEFLATED 7530 1727096029.93436: Set connection var ansible_timeout to 10 7530 1727096029.93451: Set connection var ansible_shell_executable to /bin/sh 7530 1727096029.93458: Set connection var ansible_shell_type to sh 7530 1727096029.93465: Set connection var ansible_connection to ssh 7530 1727096029.93500: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.93509: variable 'ansible_connection' from source: unknown 7530 1727096029.93517: variable 'ansible_module_compression' from source: unknown 7530 1727096029.93535: variable 'ansible_shell_type' from source: unknown 7530 1727096029.93543: variable 'ansible_shell_executable' from source: unknown 7530 1727096029.93550: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096029.93559: variable 'ansible_pipelining' from source: unknown 7530 1727096029.93566: variable 'ansible_timeout' from source: unknown 7530 1727096029.93576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096029.93726: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096029.93754: variable 'omit' from source: magic vars 7530 1727096029.93764: starting attempt loop 7530 1727096029.93855: running the handler 7530 1727096029.93858: _low_level_execute_command(): starting 7530 1727096029.93861: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096029.94598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096029.94632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096029.94745: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.94777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.94855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.96589: stdout chunk (state=3): >>>/root <<< 7530 1727096029.96748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.96752: stdout chunk (state=3): >>><<< 7530 1727096029.96755: stderr chunk (state=3): >>><<< 7530 1727096029.96787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096029.96808: _low_level_execute_command(): starting 7530 1727096029.96819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290 `" && echo ansible-tmp-1727096029.9679441-8355-174813372041290="` echo /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290 `" ) && sleep 0' 7530 1727096029.97534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096029.97573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096029.97576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096029.97580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096029.97582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096029.97584: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096029.97593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096029.97714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096029.97717: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096029.97721: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096029.97738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096029.97810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096029.99832: stdout chunk (state=3): >>>ansible-tmp-1727096029.9679441-8355-174813372041290=/root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290 <<< 7530 1727096029.99984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096029.99996: stdout chunk (state=3): >>><<< 7530 1727096030.00010: stderr chunk (state=3): >>><<< 7530 1727096030.00040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096029.9679441-8355-174813372041290=/root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096030.00092: variable 'ansible_module_compression' from source: unknown 7530 1727096030.00164: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096030.00211: variable 'ansible_facts' from source: unknown 7530 1727096030.00315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py 7530 1727096030.00604: Sending initial data 7530 1727096030.00607: Sent initial data (154 bytes) 7530 1727096030.01309: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.01401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.01413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.03093: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096030.03131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096030.03170: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5g4i9qnl /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py <<< 7530 1727096030.03174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py" <<< 7530 1727096030.03239: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5g4i9qnl" to remote "/root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py" <<< 7530 1727096030.04074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.04132: stdout chunk (state=3): >>><<< 7530 1727096030.04135: stderr chunk (state=3): >>><<< 7530 1727096030.04139: done transferring module to remote 7530 1727096030.04155: _low_level_execute_command(): starting 7530 1727096030.04164: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/ /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py && sleep 0' 7530 1727096030.04893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.04939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096030.04957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.04978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.05056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.06970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.06997: stdout chunk (state=3): >>><<< 7530 1727096030.07003: stderr chunk (state=3): >>><<< 7530 1727096030.07110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096030.07114: _low_level_execute_command(): starting 7530 1727096030.07117: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/AnsiballZ_command.py && sleep 0' 7530 1727096030.07705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096030.07721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096030.07738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.07789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.07866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096030.07889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.07919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 7530 1727096030.07999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.24609: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 08:53:50.239553", "end": "2024-09-23 08:53:50.243296", "delta": "0:00:00.003743", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096030.26222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.26285: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096030.26373: stdout chunk (state=3): >>><<< 7530 1727096030.26376: stderr chunk (state=3): >>><<< 7530 1727096030.26379: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 08:53:50.239553", "end": "2024-09-23 08:53:50.243296", "delta": "0:00:00.003743", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096030.26382: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096030.26384: _low_level_execute_command(): starting 7530 1727096030.26392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096029.9679441-8355-174813372041290/ > /dev/null 2>&1 && sleep 0' 7530 1727096030.26993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096030.27011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096030.27026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.27046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096030.27071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096030.27087: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096030.27102: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.27122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096030.27134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096030.27146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096030.27158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096030.27190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096030.27272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.27325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.27392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.29315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.29328: stdout chunk (state=3): >>><<< 7530 1727096030.29340: stderr chunk (state=3): >>><<< 7530 1727096030.29361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096030.29377: handler run complete 7530 1727096030.29407: Evaluated conditional (False): False 7530 1727096030.29423: attempt loop complete, returning result 7530 1727096030.29431: _execute() done 7530 1727096030.29436: dumping result to json 7530 1727096030.29584: done dumping result, returning 7530 1727096030.29588: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [0afff68d-5257-086b-f4f0-00000000005f] 7530 1727096030.29590: sending task result for task 0afff68d-5257-086b-f4f0-00000000005f 7530 1727096030.29673: done sending task result for task 0afff68d-5257-086b-f4f0-00000000005f 7530 1727096030.29677: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003743", "end": "2024-09-23 08:53:50.243296", "rc": 0, "start": "2024-09-23 08:53:50.239553" } STDOUT: 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 7530 1727096030.29763: no more pending results, returning what we have 7530 1727096030.29769: results queue empty 7530 1727096030.29770: checking for any_errors_fatal 7530 
1727096030.29776: done checking for any_errors_fatal 7530 1727096030.29777: checking for max_fail_percentage 7530 1727096030.29779: done checking for max_fail_percentage 7530 1727096030.29779: checking to see if all hosts have failed and the running result is not ok 7530 1727096030.29780: done checking to see if all hosts have failed 7530 1727096030.29781: getting the remaining hosts for this loop 7530 1727096030.29782: done getting the remaining hosts for this loop 7530 1727096030.29785: getting the next task for host managed_node3 7530 1727096030.29791: done getting next task for host managed_node3 7530 1727096030.29941: ^ task is: TASK: Assert default ipv6 route is present 7530 1727096030.29944: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096030.29948: getting variables 7530 1727096030.29950: in VariableManager get_vars() 7530 1727096030.29997: Calling all_inventory to load vars for managed_node3 7530 1727096030.30000: Calling groups_inventory to load vars for managed_node3 7530 1727096030.30003: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096030.30013: Calling all_plugins_play to load vars for managed_node3 7530 1727096030.30016: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096030.30019: Calling groups_plugins_play to load vars for managed_node3 7530 1727096030.31415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096030.32821: done with get_vars() 7530 1727096030.32849: done getting variables 7530 1727096030.32913: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is present] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:61 Monday 23 September 2024 08:53:50 -0400 (0:00:00.414) 0:00:21.117 ****** 7530 1727096030.32940: entering _queue_task() for managed_node3/assert 7530 1727096030.33278: worker is 1 (out of 1 available) 7530 1727096030.33292: exiting _queue_task() for managed_node3/assert 7530 1727096030.33302: done queuing things up, now waiting for results queue to drain 7530 1727096030.33304: waiting for pending results... 7530 1727096030.33602: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present 7530 1727096030.33701: in run() - task 0afff68d-5257-086b-f4f0-000000000060 7530 1727096030.33793: variable 'ansible_search_path' from source: unknown 7530 1727096030.33797: calling self._execute() 7530 1727096030.33880: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096030.33892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096030.33910: variable 'omit' from source: magic vars 7530 1727096030.34307: variable 'ansible_distribution_major_version' from source: facts 7530 1727096030.34326: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096030.34464: variable 'network_provider' from source: set_fact 7530 1727096030.34553: Evaluated conditional (network_provider == "nm"): True 7530 1727096030.34557: variable 'omit' from source: magic vars 7530 1727096030.34559: variable 'omit' from source: magic vars 7530 1727096030.34562: variable 'omit' from source: magic vars 7530 1727096030.34604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096030.34647: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096030.34680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096030.34705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096030.34725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096030.34762: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096030.34780: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096030.34789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096030.34904: Set connection var ansible_pipelining to False 7530 1727096030.35073: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096030.35076: Set connection var ansible_timeout to 10 7530 1727096030.35078: Set connection var ansible_shell_executable to /bin/sh 7530 1727096030.35081: Set connection var ansible_shell_type to sh 7530 1727096030.35083: Set connection var ansible_connection to ssh 7530 1727096030.35085: variable 'ansible_shell_executable' from source: unknown 7530 1727096030.35087: variable 'ansible_connection' from source: unknown 7530 1727096030.35089: variable 'ansible_module_compression' from source: unknown 7530 1727096030.35091: variable 'ansible_shell_type' from source: unknown 7530 1727096030.35093: variable 'ansible_shell_executable' from source: unknown 7530 1727096030.35095: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096030.35097: variable 'ansible_pipelining' from source: unknown 7530 1727096030.35099: variable 'ansible_timeout' from source: unknown 7530 1727096030.35101: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096030.35175: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096030.35195: variable 'omit' from source: magic vars 7530 1727096030.35205: starting attempt loop 7530 1727096030.35217: running the handler 7530 1727096030.35382: variable '__test_str' from source: task vars 7530 1727096030.35461: variable 'interface' from source: play vars 7530 1727096030.35478: variable 'ipv6_route' from source: set_fact 7530 1727096030.35495: Evaluated conditional (__test_str in ipv6_route.stdout): True 7530 1727096030.35505: handler run complete 7530 1727096030.35524: attempt loop complete, returning result 7530 1727096030.35531: _execute() done 7530 1727096030.35542: dumping result to json 7530 1727096030.35551: done dumping result, returning 7530 1727096030.35650: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present [0afff68d-5257-086b-f4f0-000000000060] 7530 1727096030.35653: sending task result for task 0afff68d-5257-086b-f4f0-000000000060 7530 1727096030.35723: done sending task result for task 0afff68d-5257-086b-f4f0-000000000060 7530 1727096030.35726: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096030.35805: no more pending results, returning what we have 7530 1727096030.35809: results queue empty 7530 1727096030.35809: checking for any_errors_fatal 7530 1727096030.35819: done checking for any_errors_fatal 7530 1727096030.35820: checking for max_fail_percentage 7530 1727096030.35821: done checking for max_fail_percentage 7530 1727096030.35822: checking to see if all hosts have failed and the running result is not ok 7530 1727096030.35823: done checking to see if all hosts have failed 7530 
1727096030.35824: getting the remaining hosts for this loop 7530 1727096030.35825: done getting the remaining hosts for this loop 7530 1727096030.35829: getting the next task for host managed_node3 7530 1727096030.35837: done getting next task for host managed_node3 7530 1727096030.35840: ^ task is: TASK: TEARDOWN: remove profiles. 7530 1727096030.35842: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096030.35846: getting variables 7530 1727096030.35848: in VariableManager get_vars() 7530 1727096030.35903: Calling all_inventory to load vars for managed_node3 7530 1727096030.35906: Calling groups_inventory to load vars for managed_node3 7530 1727096030.35909: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096030.35922: Calling all_plugins_play to load vars for managed_node3 7530 1727096030.35925: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096030.35929: Calling groups_plugins_play to load vars for managed_node3 7530 1727096030.37658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096030.39116: done with get_vars() 7530 1727096030.39143: done getting variables 7530 1727096030.39201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:67 Monday 23 September 2024 08:53:50 -0400 (0:00:00.062) 0:00:21.180 ****** 7530 1727096030.39230: entering _queue_task() for managed_node3/debug 7530 1727096030.39562: worker is 1 (out of 1 available) 7530 1727096030.39578: exiting _queue_task() for managed_node3/debug 7530 1727096030.39589: done queuing things up, now waiting for results queue to drain 7530 1727096030.39590: waiting for pending results... 7530 1727096030.39987: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 7530 1727096030.39994: in run() - task 0afff68d-5257-086b-f4f0-000000000061 7530 1727096030.40013: variable 'ansible_search_path' from source: unknown 7530 1727096030.40055: calling self._execute() 7530 1727096030.40166: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096030.40181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096030.40203: variable 'omit' from source: magic vars 7530 1727096030.40626: variable 'ansible_distribution_major_version' from source: facts 7530 1727096030.40645: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096030.40656: variable 'omit' from source: magic vars 7530 1727096030.40685: variable 'omit' from source: magic vars 7530 1727096030.40730: variable 'omit' from source: magic vars 7530 1727096030.40846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096030.40850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096030.40854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096030.40879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False)
7530 1727096030.40895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096030.40931: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096030.40940: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.40947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.41065: Set connection var ansible_pipelining to False
7530 1727096030.41080: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096030.41090: Set connection var ansible_timeout to 10
7530 1727096030.41103: Set connection var ansible_shell_executable to /bin/sh
7530 1727096030.41109: Set connection var ansible_shell_type to sh
7530 1727096030.41173: Set connection var ansible_connection to ssh
7530 1727096030.41176: variable 'ansible_shell_executable' from source: unknown
7530 1727096030.41178: variable 'ansible_connection' from source: unknown
7530 1727096030.41180: variable 'ansible_module_compression' from source: unknown
7530 1727096030.41182: variable 'ansible_shell_type' from source: unknown
7530 1727096030.41184: variable 'ansible_shell_executable' from source: unknown
7530 1727096030.41185: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.41187: variable 'ansible_pipelining' from source: unknown
7530 1727096030.41189: variable 'ansible_timeout' from source: unknown
7530 1727096030.41190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.41319: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096030.41337: variable 'omit' from source: magic vars
7530 1727096030.41346: starting attempt loop
7530 1727096030.41353: running the handler
7530 1727096030.41410: handler run complete
7530 1727096030.41432: attempt loop complete, returning result
7530 1727096030.41439: _execute() done
7530 1727096030.41445: dumping result to json
7530 1727096030.41498: done dumping result, returning
7530 1727096030.41501: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [0afff68d-5257-086b-f4f0-000000000061]
7530 1727096030.41503: sending task result for task 0afff68d-5257-086b-f4f0-000000000061
7530 1727096030.41573: done sending task result for task 0afff68d-5257-086b-f4f0-000000000061
7530 1727096030.41576: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

##################################################
7530 1727096030.41650: no more pending results, returning what we have
7530 1727096030.41653: results queue empty
7530 1727096030.41654: checking for any_errors_fatal
7530 1727096030.41660: done checking for any_errors_fatal
7530 1727096030.41661: checking for max_fail_percentage
7530 1727096030.41663: done checking for max_fail_percentage
7530 1727096030.41663: checking to see if all hosts have failed and the running result is not ok
7530 1727096030.41665: done checking to see if all hosts have failed
7530 1727096030.41665: getting the remaining hosts for this loop
7530 1727096030.41669: done getting the remaining hosts for this loop
7530 1727096030.41673: getting the next task for host managed_node3
7530 1727096030.41681: done getting next task for host managed_node3
7530 1727096030.41686: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7530 1727096030.41689: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096030.41708: getting variables
7530 1727096030.41711: in VariableManager get_vars()
7530 1727096030.41764: Calling all_inventory to load vars for managed_node3
7530 1727096030.41870: Calling groups_inventory to load vars for managed_node3
7530 1727096030.41875: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096030.41887: Calling all_plugins_play to load vars for managed_node3
7530 1727096030.41891: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096030.41894: Calling groups_plugins_play to load vars for managed_node3
7530 1727096030.43395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096030.45018: done with get_vars()
7530 1727096030.45047: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Monday 23 September 2024 08:53:50 -0400 (0:00:00.059) 0:00:21.239 ******
7530 1727096030.45146: entering _queue_task() for managed_node3/include_tasks
7530 1727096030.45476: worker is 1 (out of 1 available)
7530 1727096030.45490: exiting _queue_task() for managed_node3/include_tasks
7530 1727096030.45503: done queuing things up, now waiting for results queue to drain
7530 1727096030.45505: waiting for pending results...
7530 1727096030.45888: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7530 1727096030.45931: in run() - task 0afff68d-5257-086b-f4f0-000000000069
7530 1727096030.45953: variable 'ansible_search_path' from source: unknown
7530 1727096030.45961: variable 'ansible_search_path' from source: unknown
7530 1727096030.46005: calling self._execute()
7530 1727096030.46110: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.46121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.46134: variable 'omit' from source: magic vars
7530 1727096030.46512: variable 'ansible_distribution_major_version' from source: facts
7530 1727096030.46534: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096030.46545: _execute() done
7530 1727096030.46552: dumping result to json
7530 1727096030.46558: done dumping result, returning
7530 1727096030.46573: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-086b-f4f0-000000000069]
7530 1727096030.46582: sending task result for task 0afff68d-5257-086b-f4f0-000000000069
7530 1727096030.46824: done sending task result for task 0afff68d-5257-086b-f4f0-000000000069
7530 1727096030.46827: WORKER PROCESS EXITING
7530 1727096030.46881: no more pending results, returning what we have
7530 1727096030.46887: in VariableManager get_vars()
7530 1727096030.46954: Calling all_inventory to load vars for managed_node3
7530 1727096030.46957: Calling groups_inventory to load vars for managed_node3
7530 1727096030.46960: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096030.46977: Calling all_plugins_play to load vars for managed_node3
7530 1727096030.46980: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096030.46983: Calling groups_plugins_play to load vars for managed_node3
7530 1727096030.48626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096030.50132: done with get_vars()
7530 1727096030.50158: variable 'ansible_search_path' from source: unknown
7530 1727096030.50160: variable 'ansible_search_path' from source: unknown
7530 1727096030.50207: we have included files to process
7530 1727096030.50209: generating all_blocks data
7530 1727096030.50211: done generating all_blocks data
7530 1727096030.50216: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7530 1727096030.50217: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7530 1727096030.50220: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7530 1727096030.50786: done processing included file
7530 1727096030.50788: iterating over new_blocks loaded from include file
7530 1727096030.50790: in VariableManager get_vars()
7530 1727096030.50822: done with get_vars()
7530 1727096030.50824: filtering new block on tags
7530 1727096030.50842: done filtering new block on tags
7530 1727096030.50845: in VariableManager get_vars()
7530 1727096030.50876: done with get_vars()
7530 1727096030.50878: filtering new block on tags
7530 1727096030.50899: done filtering new block on tags
7530 1727096030.50902: in VariableManager get_vars()
7530 1727096030.50932: done with get_vars()
7530 1727096030.50934: filtering new block on tags
7530 1727096030.50951: done filtering new block on tags
7530 1727096030.50953: done iterating over new_blocks loaded from include file
included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
7530 1727096030.50960: extending task lists for all hosts with included blocks
7530 1727096030.51780: done extending task lists
7530 1727096030.51782: done processing included files
7530 1727096030.51783: results queue empty
7530 1727096030.51784: checking for any_errors_fatal
7530 1727096030.51787: done checking for any_errors_fatal
7530 1727096030.51788: checking for max_fail_percentage
7530 1727096030.51789: done checking for max_fail_percentage
7530 1727096030.51790: checking to see if all hosts have failed and the running result is not ok
7530 1727096030.51791: done checking to see if all hosts have failed
7530 1727096030.51792: getting the remaining hosts for this loop
7530 1727096030.51793: done getting the remaining hosts for this loop
7530 1727096030.51796: getting the next task for host managed_node3
7530 1727096030.51801: done getting next task for host managed_node3
7530 1727096030.51804: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
7530 1727096030.51806: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096030.51817: getting variables
7530 1727096030.51818: in VariableManager get_vars()
7530 1727096030.51840: Calling all_inventory to load vars for managed_node3
7530 1727096030.51842: Calling groups_inventory to load vars for managed_node3
7530 1727096030.51845: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096030.51851: Calling all_plugins_play to load vars for managed_node3
7530 1727096030.51854: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096030.51857: Calling groups_plugins_play to load vars for managed_node3
7530 1727096030.53026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096030.54625: done with get_vars()
7530 1727096030.54649: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Monday 23 September 2024 08:53:50 -0400 (0:00:00.095) 0:00:21.335 ******
7530 1727096030.54735: entering _queue_task() for managed_node3/setup
7530 1727096030.55085: worker is 1 (out of 1 available)
7530 1727096030.55098: exiting _queue_task() for managed_node3/setup
7530 1727096030.55111: done queuing things up, now waiting for results queue to drain
7530 1727096030.55112: waiting for pending results...
7530 1727096030.55489: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
7530 1727096030.55597: in run() - task 0afff68d-5257-086b-f4f0-000000000d46
7530 1727096030.55621: variable 'ansible_search_path' from source: unknown
7530 1727096030.55630: variable 'ansible_search_path' from source: unknown
7530 1727096030.55674: calling self._execute()
7530 1727096030.55826: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.55829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.55832: variable 'omit' from source: magic vars
7530 1727096030.56188: variable 'ansible_distribution_major_version' from source: facts
7530 1727096030.56208: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096030.56441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096030.59677: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096030.59721: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096030.59817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096030.59929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096030.59961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096030.60178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096030.60223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096030.60255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096030.60372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096030.60394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096030.60496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096030.60629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096030.60666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096030.60712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096030.60732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096030.61137: variable '__network_required_facts' from source: role '' defaults
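Each filter plugin above appears twice: once loaded from its file path and once under its collection-qualified name with `found_in_cache=True`, showing that the plugin loader memoizes by name after the first load. A toy model of that caching behavior (the function and names are illustrative, not Ansible's actual loader API):

```python
# Toy model of a plugin-loader cache keyed by plugin name, mirroring the
# found_in_cache=True entries in the log. Names and paths are illustrative.
_cache: dict[str, str] = {}

def load_plugin(name: str, path: str) -> tuple[str, bool]:
    """Return (plugin, found_in_cache); load and memoize on first use."""
    if name in _cache:
        return _cache[name], True
    _cache[name] = f"<FilterModule {name} from {path}>"
    return _cache[name], False

_, first = load_plugin("core", "plugins/filter/core.py")
_, second = load_plugin("core", "plugins/filter/core.py")
print(first, second)  # False True
```

The second lookup hits the cache, which is why repeated `Loading FilterModule ...` lines in the trace are cheap.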
7530 1727096030.61141: variable 'ansible_facts' from source: unknown
7530 1727096030.61897: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
7530 1727096030.61909: when evaluation is False, skipping this task
7530 1727096030.61918: _execute() done
7530 1727096030.61926: dumping result to json
7530 1727096030.61935: done dumping result, returning
7530 1727096030.61948: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-086b-f4f0-000000000d46]
7530 1727096030.61964: sending task result for task 0afff68d-5257-086b-f4f0-000000000d46
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7530 1727096030.62221: no more pending results, returning what we have
7530 1727096030.62226: results queue empty
7530 1727096030.62227: checking for any_errors_fatal
7530 1727096030.62228: done checking for any_errors_fatal
7530 1727096030.62229: checking for max_fail_percentage
7530 1727096030.62231: done checking for max_fail_percentage
7530 1727096030.62232: checking to see if all hosts have failed and the running result is not ok
7530 1727096030.62233: done checking to see if all hosts have failed
7530 1727096030.62234: getting the remaining hosts for this loop
7530 1727096030.62235: done getting the remaining hosts for this loop
7530 1727096030.62239: getting the next task for host managed_node3
7530 1727096030.62248: done getting next task for host managed_node3
7530 1727096030.62252: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
7530 1727096030.62256: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096030.62278: getting variables
7530 1727096030.62280: in VariableManager get_vars()
7530 1727096030.62335: Calling all_inventory to load vars for managed_node3
7530 1727096030.62338: Calling groups_inventory to load vars for managed_node3
7530 1727096030.62341: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096030.62353: Calling all_plugins_play to load vars for managed_node3
7530 1727096030.62357: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096030.62360: Calling groups_plugins_play to load vars for managed_node3
7530 1727096030.63183: done sending task result for task 0afff68d-5257-086b-f4f0-000000000d46
7530 1727096030.63187: WORKER PROCESS EXITING
7530 1727096030.64013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096030.67042: done with get_vars()
7530 1727096030.67077: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Monday 23 September 2024 08:53:50 -0400 (0:00:00.124) 0:00:21.460 ******
7530 1727096030.67185: entering _queue_task() for managed_node3/stat
7530 1727096030.67527: worker is 1 (out of 1 available)
7530 1727096030.67541: exiting _queue_task() for managed_node3/stat
7530 1727096030.67553: done queuing things up, now waiting for results queue to drain
7530 1727096030.67555: waiting for pending results...
7530 1727096030.67822: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
7530 1727096030.67986: in run() - task 0afff68d-5257-086b-f4f0-000000000d48
7530 1727096030.68011: variable 'ansible_search_path' from source: unknown
7530 1727096030.68018: variable 'ansible_search_path' from source: unknown
7530 1727096030.68060: calling self._execute()
7530 1727096030.68172: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.68185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.68200: variable 'omit' from source: magic vars
7530 1727096030.68555: variable 'ansible_distribution_major_version' from source: facts
7530 1727096030.68566: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096030.68689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096030.68890: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096030.68927: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096030.68948: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096030.68977: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096030.69043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096030.69061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096030.69085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096030.69101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096030.69170: variable '__network_is_ostree' from source: set_fact
7530 1727096030.69177: Evaluated conditional (not __network_is_ostree is defined): False
7530 1727096030.69180: when evaluation is False, skipping this task
7530 1727096030.69183: _execute() done
7530 1727096030.69187: dumping result to json
7530 1727096030.69189: done dumping result, returning
7530 1727096030.69198: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-086b-f4f0-000000000d48]
7530 1727096030.69203: sending task result for task 0afff68d-5257-086b-f4f0-000000000d48
7530 1727096030.69290: done sending task result for task 0afff68d-5257-086b-f4f0-000000000d48
7530 1727096030.69293: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096030.69349: no more pending results, returning what we have
7530 1727096030.69353: results queue empty
7530 1727096030.69354: checking for any_errors_fatal
7530 1727096030.69360: done checking for any_errors_fatal
7530 1727096030.69360: checking for max_fail_percentage
7530 1727096030.69362: done checking for max_fail_percentage
7530 1727096030.69363: checking to see if all hosts have failed and the running result is not ok
7530 1727096030.69364: done checking to see if all hosts have failed
7530 1727096030.69365: getting the remaining hosts for this loop
7530 1727096030.69366: done getting the remaining hosts for this loop
7530 1727096030.69371: getting the next task for host managed_node3
7530 1727096030.69379: done getting next task for host managed_node3
7530 1727096030.69382: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
7530 1727096030.69387: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096030.69407: getting variables
7530 1727096030.69409: in VariableManager get_vars()
7530 1727096030.69456: Calling all_inventory to load vars for managed_node3
7530 1727096030.69459: Calling groups_inventory to load vars for managed_node3
7530 1727096030.69461: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096030.69478: Calling all_plugins_play to load vars for managed_node3
7530 1727096030.69481: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096030.69484: Calling groups_plugins_play to load vars for managed_node3
7530 1727096030.70658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096030.71781: done with get_vars()
7530 1727096030.71803: done getting variables
7530 1727096030.71848: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Monday 23 September 2024 08:53:50 -0400 (0:00:00.046) 0:00:21.507 ******
7530 1727096030.71879: entering _queue_task() for managed_node3/set_fact
7530 1727096030.72130: worker is 1 (out of 1 available)
7530 1727096030.72143: exiting _queue_task() for managed_node3/set_fact
7530 1727096030.72156: done queuing things up, now waiting for results queue to drain
7530 1727096030.72158: waiting for pending results...
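The `Ensure ansible_facts used by role are present` task earlier in this trace was skipped because `(__network_required_facts | difference(ansible_facts.keys() | list) | length > 0)` evaluated to False: no required fact was missing, so there was nothing for `setup` to gather. A plain-Python stand-in for that Jinja2 expression (the fact names below are illustrative, not taken from the log):

```python
# Plain-Python model of Ansible's `difference` filter as used in the
# conditional logged above; fact names here are illustrative only.
network_required_facts = {"distribution", "distribution_major_version"}
gathered_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "hostname": "managed-node3",
}

# difference(ansible_facts.keys() | list): required facts not yet gathered
missing = network_required_facts - set(gathered_facts)

# length > 0 -> run setup; here nothing is missing, so the task is skipped
run_setup = len(missing) > 0
print(run_setup)  # False
```

When a required fact is absent from `gathered_facts`, `missing` is non-empty and the condition flips to True, which would let the `setup` task run.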
7530 1727096030.72350: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
7530 1727096030.72460: in run() - task 0afff68d-5257-086b-f4f0-000000000d49
7530 1727096030.72474: variable 'ansible_search_path' from source: unknown
7530 1727096030.72478: variable 'ansible_search_path' from source: unknown
7530 1727096030.72508: calling self._execute()
7530 1727096030.72588: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.72591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.72605: variable 'omit' from source: magic vars
7530 1727096030.72887: variable 'ansible_distribution_major_version' from source: facts
7530 1727096030.72897: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096030.73019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096030.73373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096030.73377: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096030.73380: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096030.73382: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096030.73471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096030.73503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096030.73533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096030.73561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096030.73662: variable '__network_is_ostree' from source: set_fact
7530 1727096030.73677: Evaluated conditional (not __network_is_ostree is defined): False
7530 1727096030.73687: when evaluation is False, skipping this task
7530 1727096030.73695: _execute() done
7530 1727096030.73701: dumping result to json
7530 1727096030.73709: done dumping result, returning
7530 1727096030.73721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-086b-f4f0-000000000d49]
7530 1727096030.73731: sending task result for task 0afff68d-5257-086b-f4f0-000000000d49
7530 1727096030.74075: done sending task result for task 0afff68d-5257-086b-f4f0-000000000d49
7530 1727096030.74079: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096030.74123: no more pending results, returning what we have
7530 1727096030.74126: results queue empty
7530 1727096030.74127: checking for any_errors_fatal
7530 1727096030.74132: done checking for any_errors_fatal
7530 1727096030.74132: checking for max_fail_percentage
7530 1727096030.74134: done checking for max_fail_percentage
7530 1727096030.74135: checking to see if all hosts have failed and the running result is not ok
7530 1727096030.74136: done checking to see if all hosts have failed
7530 1727096030.74137: getting the remaining hosts for this loop
7530 1727096030.74138: done getting the remaining hosts for this loop
7530 1727096030.74142: getting the next task for host managed_node3
7530 1727096030.74151: done getting next task for host managed_node3
7530 1727096030.74155: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
7530 1727096030.74159: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096030.74177: getting variables
7530 1727096030.74179: in VariableManager get_vars()
7530 1727096030.74226: Calling all_inventory to load vars for managed_node3
7530 1727096030.74229: Calling groups_inventory to load vars for managed_node3
7530 1727096030.74232: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096030.74242: Calling all_plugins_play to load vars for managed_node3
7530 1727096030.74245: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096030.74247: Calling groups_plugins_play to load vars for managed_node3
7530 1727096030.75300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096030.76180: done with get_vars()
7530 1727096030.76204: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Monday 23 September 2024 08:53:50 -0400 (0:00:00.044) 0:00:21.551 ******
7530 1727096030.76281: entering _queue_task() for managed_node3/service_facts
7530 1727096030.76595: worker is 1 (out of 1 available)
7530 1727096030.76612: exiting _queue_task() for managed_node3/service_facts
7530 1727096030.76628: done queuing things up, now waiting for results queue to drain
7530 1727096030.76630: waiting for pending results...
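Both ostree-related tasks above (`Check if system is ostree` and `Set flag to indicate system is ostree`) were skipped on the same guard, `not __network_is_ostree is defined`: once an earlier `set_fact` has defined `__network_is_ostree`, neither task needs to run again. A small Python model of that guard (the stored value is illustrative; in Ansible, `is defined` tests whether the variable exists at all, regardless of its value):

```python
# Model of the `not __network_is_ostree is defined` guard logged above.
# In Ansible, "is defined" tests for the variable's existence in host vars;
# the stored value itself is illustrative here.
host_vars = {"__network_is_ostree": False}

def is_defined(name: str) -> bool:
    """Rough equivalent of Jinja2's `is defined` test."""
    return name in host_vars

# The task runs only while the fact is still undefined.
run_task = not is_defined("__network_is_ostree")
print(run_task)  # False -> both ostree tasks are skipped
```

Note the guard passes even when the fact is defined as `False`; only a missing key would let the tasks run.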
7530 1727096030.76849: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running
7530 1727096030.77013: in run() - task 0afff68d-5257-086b-f4f0-000000000d4b
7530 1727096030.77034: variable 'ansible_search_path' from source: unknown
7530 1727096030.77042: variable 'ansible_search_path' from source: unknown
7530 1727096030.77090: calling self._execute()
7530 1727096030.77198: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.77217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.77238: variable 'omit' from source: magic vars
7530 1727096030.77617: variable 'ansible_distribution_major_version' from source: facts
7530 1727096030.77773: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096030.77776: variable 'omit' from source: magic vars
7530 1727096030.77779: variable 'omit' from source: magic vars
7530 1727096030.77780: variable 'omit' from source: magic vars
7530 1727096030.77795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096030.77834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096030.77860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096030.77885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096030.77901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096030.78172: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096030.78176: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.78178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.78180: Set connection var ansible_pipelining to False
7530 1727096030.78183: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096030.78184: Set connection var ansible_timeout to 10
7530 1727096030.78186: Set connection var ansible_shell_executable to /bin/sh
7530 1727096030.78188: Set connection var ansible_shell_type to sh
7530 1727096030.78190: Set connection var ansible_connection to ssh
7530 1727096030.78193: variable 'ansible_shell_executable' from source: unknown
7530 1727096030.78194: variable 'ansible_connection' from source: unknown
7530 1727096030.78197: variable 'ansible_module_compression' from source: unknown
7530 1727096030.78199: variable 'ansible_shell_type' from source: unknown
7530 1727096030.78201: variable 'ansible_shell_executable' from source: unknown
7530 1727096030.78203: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096030.78205: variable 'ansible_pipelining' from source: unknown
7530 1727096030.78207: variable 'ansible_timeout' from source: unknown
7530 1727096030.78209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096030.78381: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
7530 1727096030.78406: variable 'omit' from source: magic vars
7530 1727096030.78416: starting attempt loop
7530 1727096030.78422: running the handler
7530 1727096030.78444: _low_level_execute_command(): starting
7530 1727096030.78457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096030.79190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
<<<
7530 1727096030.79209: stderr
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.79254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.79279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.79315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.81060: stdout chunk (state=3): >>>/root <<< 7530 1727096030.81212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.81216: stdout chunk (state=3): >>><<< 7530 1727096030.81218: stderr chunk (state=3): >>><<< 7530 1727096030.81343: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096030.81347: _low_level_execute_command(): starting 7530 1727096030.81350: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036 `" && echo ansible-tmp-1727096030.8124754-8387-183762089094036="` echo /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036 `" ) && sleep 0' 7530 1727096030.81839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.81856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.81883: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.81915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096030.81935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.81987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.84037: stdout chunk (state=3): >>>ansible-tmp-1727096030.8124754-8387-183762089094036=/root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036 <<< 7530 1727096030.84145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.84178: stderr chunk (state=3): >>><<< 7530 1727096030.84182: stdout chunk (state=3): >>><<< 7530 1727096030.84201: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096030.8124754-8387-183762089094036=/root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096030.84245: variable 'ansible_module_compression' from source: unknown 7530 1727096030.84288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7530 1727096030.84320: variable 'ansible_facts' from source: unknown 7530 1727096030.84381: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py 7530 1727096030.84496: Sending initial data 7530 1727096030.84500: Sent initial data (160 bytes) 7530 1727096030.84973: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096030.84977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096030.84980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.84982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.84984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.85037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096030.85041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.85091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.86746: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096030.86776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096030.86810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpo0wjwv_e /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py <<< 7530 1727096030.86816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py" <<< 7530 1727096030.86846: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpo0wjwv_e" to remote "/root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py" <<< 7530 1727096030.86848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py" <<< 7530 1727096030.87381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.87429: stderr chunk (state=3): >>><<< 7530 1727096030.87432: stdout chunk (state=3): >>><<< 7530 1727096030.87492: done transferring module to remote 7530 1727096030.87501: _low_level_execute_command(): starting 7530 1727096030.87506: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/ /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py && sleep 0' 7530 1727096030.87951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.87988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096030.87991: stderr chunk (state=3): >>>debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.87994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.87996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.88045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096030.88048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.88051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.88096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096030.89955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096030.89980: stderr chunk (state=3): >>><<< 7530 1727096030.89983: stdout chunk (state=3): >>><<< 7530 1727096030.89998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096030.90001: _low_level_execute_command(): starting 7530 1727096030.90010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/AnsiballZ_service_facts.py && sleep 0' 7530 1727096030.90454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.90486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096030.90489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096030.90491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 
1727096030.90545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096030.90548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096030.90554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096030.90599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096032.54575: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7530 1727096032.54584: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 7530 1727096032.54598: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service<<< 7530 1727096032.54603: stdout chunk (state=3): >>>": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7530 1727096032.56258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096032.56286: stderr chunk (state=3): >>><<< 7530 1727096032.56291: stdout chunk (state=3): >>><<< 7530 1727096032.56318: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096032.56990: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096032.56997: _low_level_execute_command(): starting 7530 1727096032.57003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096030.8124754-8387-183762089094036/ > /dev/null 2>&1 && sleep 0' 7530 1727096032.57450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.57455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096032.57485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.57488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096032.57491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 
1727096032.57493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.57557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096032.57560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096032.57563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096032.57609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096032.59482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096032.59504: stderr chunk (state=3): >>><<< 7530 1727096032.59507: stdout chunk (state=3): >>><<< 7530 1727096032.59521: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096032.59529: handler run complete 7530 1727096032.59646: variable 'ansible_facts' from source: unknown 7530 1727096032.59751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096032.60021: variable 'ansible_facts' from source: unknown 7530 1727096032.60108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096032.60222: attempt loop complete, returning result 7530 1727096032.60225: _execute() done 7530 1727096032.60236: dumping result to json 7530 1727096032.60271: done dumping result, returning 7530 1727096032.60279: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-086b-f4f0-000000000d4b] 7530 1727096032.60283: sending task result for task 0afff68d-5257-086b-f4f0-000000000d4b 7530 1727096032.60977: done sending task result for task 0afff68d-5257-086b-f4f0-000000000d4b 7530 1727096032.60980: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096032.61039: no more pending results, returning what we have 7530 1727096032.61041: results queue empty 7530 1727096032.61041: checking for any_errors_fatal 7530 1727096032.61044: done checking for any_errors_fatal 7530 1727096032.61044: checking for max_fail_percentage 7530 1727096032.61045: done checking for max_fail_percentage 7530 1727096032.61046: checking to see if all hosts have failed and the running result is not ok 7530 1727096032.61046: done checking to see if all hosts have failed 7530 1727096032.61047: getting the remaining hosts for this loop 7530 1727096032.61047: done getting the remaining hosts for this loop 7530 
1727096032.61050: getting the next task for host managed_node3 7530 1727096032.61053: done getting next task for host managed_node3 7530 1727096032.61055: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7530 1727096032.61059: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096032.61065: getting variables 7530 1727096032.61066: in VariableManager get_vars() 7530 1727096032.61095: Calling all_inventory to load vars for managed_node3 7530 1727096032.61098: Calling groups_inventory to load vars for managed_node3 7530 1727096032.61100: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096032.61107: Calling all_plugins_play to load vars for managed_node3 7530 1727096032.61109: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096032.61111: Calling groups_plugins_play to load vars for managed_node3 7530 1727096032.61794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096032.62678: done with get_vars() 7530 1727096032.62700: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:53:52 -0400 (0:00:01.865) 0:00:23.416 ****** 7530 1727096032.62789: entering _queue_task() for managed_node3/package_facts 7530 1727096032.63053: worker is 1 (out of 1 available) 7530 1727096032.63070: exiting _queue_task() for managed_node3/package_facts 7530 1727096032.63081: done queuing things up, now waiting for results queue to drain 7530 1727096032.63083: waiting for pending results... 
7530 1727096032.63279: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7530 1727096032.63383: in run() - task 0afff68d-5257-086b-f4f0-000000000d4c 7530 1727096032.63403: variable 'ansible_search_path' from source: unknown 7530 1727096032.63407: variable 'ansible_search_path' from source: unknown 7530 1727096032.63431: calling self._execute() 7530 1727096032.63514: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096032.63522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096032.63531: variable 'omit' from source: magic vars 7530 1727096032.63821: variable 'ansible_distribution_major_version' from source: facts 7530 1727096032.63834: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096032.63838: variable 'omit' from source: magic vars 7530 1727096032.63895: variable 'omit' from source: magic vars 7530 1727096032.63918: variable 'omit' from source: magic vars 7530 1727096032.63955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096032.63991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096032.64007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096032.64021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096032.64032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096032.64056: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096032.64065: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096032.64069: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096032.64140: Set connection var ansible_pipelining to False 7530 1727096032.64143: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096032.64149: Set connection var ansible_timeout to 10 7530 1727096032.64157: Set connection var ansible_shell_executable to /bin/sh 7530 1727096032.64159: Set connection var ansible_shell_type to sh 7530 1727096032.64161: Set connection var ansible_connection to ssh 7530 1727096032.64185: variable 'ansible_shell_executable' from source: unknown 7530 1727096032.64188: variable 'ansible_connection' from source: unknown 7530 1727096032.64191: variable 'ansible_module_compression' from source: unknown 7530 1727096032.64193: variable 'ansible_shell_type' from source: unknown 7530 1727096032.64195: variable 'ansible_shell_executable' from source: unknown 7530 1727096032.64197: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096032.64200: variable 'ansible_pipelining' from source: unknown 7530 1727096032.64205: variable 'ansible_timeout' from source: unknown 7530 1727096032.64209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096032.64359: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096032.64370: variable 'omit' from source: magic vars 7530 1727096032.64375: starting attempt loop 7530 1727096032.64378: running the handler 7530 1727096032.64392: _low_level_execute_command(): starting 7530 1727096032.64402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096032.64922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.64926: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096032.64932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096032.64936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.64990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096032.64993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096032.64995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096032.65039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096032.66757: stdout chunk (state=3): >>>/root <<< 7530 1727096032.66848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096032.66883: stderr chunk (state=3): >>><<< 7530 1727096032.66886: stdout chunk (state=3): >>><<< 7530 1727096032.66907: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096032.66920: _low_level_execute_command(): starting 7530 1727096032.66926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990 `" && echo ansible-tmp-1727096032.6690793-8424-60404147700990="` echo /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990 `" ) && sleep 0' 7530 1727096032.67395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.67399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.67410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096032.67412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096032.67414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.67450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096032.67463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096032.67509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096032.69515: stdout chunk (state=3): >>>ansible-tmp-1727096032.6690793-8424-60404147700990=/root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990 <<< 7530 1727096032.69613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096032.69645: stderr chunk (state=3): >>><<< 7530 1727096032.69648: stdout chunk (state=3): >>><<< 7530 1727096032.69662: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096032.6690793-8424-60404147700990=/root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096032.69709: variable 'ansible_module_compression' from source: unknown 7530 1727096032.69750: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7530 1727096032.69805: variable 'ansible_facts' from source: unknown 7530 1727096032.69926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py 7530 1727096032.70036: Sending initial data 7530 1727096032.70040: Sent initial data (159 bytes) 7530 1727096032.70499: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.70502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.70504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096032.70506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096032.70508: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.70562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096032.70565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096032.70575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096032.70608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096032.72284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096032.72375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096032.72379: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpwuuj53y8 /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py <<< 7530 1727096032.72383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py" <<< 7530 1727096032.72405: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpwuuj53y8" to remote "/root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py" <<< 7530 1727096032.73977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096032.73982: stderr chunk (state=3): >>><<< 7530 1727096032.73984: stdout chunk (state=3): >>><<< 7530 1727096032.73986: done transferring module to remote 7530 1727096032.73989: _low_level_execute_command(): starting 7530 1727096032.73991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/ /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py && sleep 0' 7530 1727096032.74609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096032.74643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.74744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.74782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096032.74812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096032.74881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096032.76714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096032.76743: stderr chunk (state=3): >>><<< 7530 1727096032.76747: stdout chunk (state=3): >>><<< 7530 1727096032.76762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096032.76765: _low_level_execute_command(): starting 7530 1727096032.76772: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/AnsiballZ_package_facts.py && sleep 0' 7530 1727096032.77245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.77249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.77251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096032.77254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096032.77256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096032.77305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096032.77308: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 7530 1727096032.77311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096032.77356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096033.22418: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 7530 1727096033.22430: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 7530 1727096033.22493: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": 
"libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": 
"1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": 
[{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 7530 1727096033.22510: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": 
"libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 7530 1727096033.22544: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", 
"version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": 
"ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": 
"1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7530 1727096033.24560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096033.24596: stderr chunk (state=3): >>><<< 7530 1727096033.24599: stdout chunk (state=3): >>><<< 7530 1727096033.24643: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": 
"8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": 
"libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": 
"libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": 
[{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": 
"kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": 
"rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096033.26442: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096033.26459: _low_level_execute_command(): starting 7530 1727096033.26463: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096032.6690793-8424-60404147700990/ > /dev/null 2>&1 && sleep 0' 7530 1727096033.26942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096033.26947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096033.26950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096033.26952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096033.27005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096033.27009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096033.27013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096033.27053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096033.28954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096033.28979: stderr chunk (state=3): >>><<< 7530 1727096033.28986: stdout chunk (state=3): >>><<< 7530 1727096033.29002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096033.29008: handler run complete 7530 1727096033.29476: variable 'ansible_facts' from source: unknown 7530 1727096033.29801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096033.30875: variable 'ansible_facts' from source: unknown 7530 1727096033.31128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096033.31545: attempt loop complete, returning result 7530 1727096033.31553: _execute() done 7530 1727096033.31556: dumping result to json 7530 1727096033.31678: done dumping result, returning 7530 1727096033.31687: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-086b-f4f0-000000000d4c] 7530 1727096033.31689: sending task result for task 0afff68d-5257-086b-f4f0-000000000d4c 7530 1727096033.33070: done sending task result for task 0afff68d-5257-086b-f4f0-000000000d4c 7530 1727096033.33074: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096033.33173: no more pending results, returning what we have 7530 1727096033.33175: results queue empty 7530 1727096033.33176: checking for any_errors_fatal 7530 1727096033.33179: done checking 
for any_errors_fatal 7530 1727096033.33180: checking for max_fail_percentage 7530 1727096033.33181: done checking for max_fail_percentage 7530 1727096033.33181: checking to see if all hosts have failed and the running result is not ok 7530 1727096033.33182: done checking to see if all hosts have failed 7530 1727096033.33182: getting the remaining hosts for this loop 7530 1727096033.33183: done getting the remaining hosts for this loop 7530 1727096033.33186: getting the next task for host managed_node3 7530 1727096033.33191: done getting next task for host managed_node3 7530 1727096033.33193: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096033.33195: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096033.33202: getting variables 7530 1727096033.33203: in VariableManager get_vars() 7530 1727096033.33237: Calling all_inventory to load vars for managed_node3 7530 1727096033.33239: Calling groups_inventory to load vars for managed_node3 7530 1727096033.33240: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096033.33247: Calling all_plugins_play to load vars for managed_node3 7530 1727096033.33249: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096033.33251: Calling groups_plugins_play to load vars for managed_node3 7530 1727096033.33952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096033.34828: done with get_vars() 7530 1727096033.34857: done getting variables 7530 1727096033.34905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:53:53 -0400 (0:00:00.721) 0:00:24.137 ****** 7530 1727096033.34938: entering _queue_task() for managed_node3/debug 7530 1727096033.35199: worker is 1 (out of 1 available) 7530 1727096033.35213: exiting _queue_task() for managed_node3/debug 7530 1727096033.35229: done queuing things up, now waiting for results queue to drain 7530 1727096033.35231: waiting for pending results... 
7530 1727096033.35421: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096033.35516: in run() - task 0afff68d-5257-086b-f4f0-00000000006a 7530 1727096033.35532: variable 'ansible_search_path' from source: unknown 7530 1727096033.35536: variable 'ansible_search_path' from source: unknown 7530 1727096033.35564: calling self._execute() 7530 1727096033.35642: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096033.35646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096033.35656: variable 'omit' from source: magic vars 7530 1727096033.35949: variable 'ansible_distribution_major_version' from source: facts 7530 1727096033.35960: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096033.35965: variable 'omit' from source: magic vars 7530 1727096033.36011: variable 'omit' from source: magic vars 7530 1727096033.36084: variable 'network_provider' from source: set_fact 7530 1727096033.36099: variable 'omit' from source: magic vars 7530 1727096033.36137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096033.36164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096033.36182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096033.36195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096033.36206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096033.36232: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096033.36236: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096033.36238: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096033.36308: Set connection var ansible_pipelining to False 7530 1727096033.36311: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096033.36317: Set connection var ansible_timeout to 10 7530 1727096033.36325: Set connection var ansible_shell_executable to /bin/sh 7530 1727096033.36332: Set connection var ansible_shell_type to sh 7530 1727096033.36334: Set connection var ansible_connection to ssh 7530 1727096033.36352: variable 'ansible_shell_executable' from source: unknown 7530 1727096033.36355: variable 'ansible_connection' from source: unknown 7530 1727096033.36358: variable 'ansible_module_compression' from source: unknown 7530 1727096033.36360: variable 'ansible_shell_type' from source: unknown 7530 1727096033.36362: variable 'ansible_shell_executable' from source: unknown 7530 1727096033.36364: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096033.36368: variable 'ansible_pipelining' from source: unknown 7530 1727096033.36371: variable 'ansible_timeout' from source: unknown 7530 1727096033.36376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096033.36491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096033.36501: variable 'omit' from source: magic vars 7530 1727096033.36506: starting attempt loop 7530 1727096033.36509: running the handler 7530 1727096033.36546: handler run complete 7530 1727096033.36560: attempt loop complete, returning result 7530 1727096033.36564: _execute() done 7530 1727096033.36569: dumping result to json 7530 1727096033.36572: done dumping result, returning 7530 1727096033.36574: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-086b-f4f0-00000000006a] 7530 1727096033.36579: sending task result for task 0afff68d-5257-086b-f4f0-00000000006a 7530 1727096033.36664: done sending task result for task 0afff68d-5257-086b-f4f0-00000000006a 7530 1727096033.36668: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7530 1727096033.36740: no more pending results, returning what we have 7530 1727096033.36743: results queue empty 7530 1727096033.36745: checking for any_errors_fatal 7530 1727096033.36754: done checking for any_errors_fatal 7530 1727096033.36755: checking for max_fail_percentage 7530 1727096033.36756: done checking for max_fail_percentage 7530 1727096033.36757: checking to see if all hosts have failed and the running result is not ok 7530 1727096033.36758: done checking to see if all hosts have failed 7530 1727096033.36758: getting the remaining hosts for this loop 7530 1727096033.36760: done getting the remaining hosts for this loop 7530 1727096033.36763: getting the next task for host managed_node3 7530 1727096033.36773: done getting next task for host managed_node3 7530 1727096033.36778: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7530 1727096033.36781: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096033.36792: getting variables 7530 1727096033.36794: in VariableManager get_vars() 7530 1727096033.36841: Calling all_inventory to load vars for managed_node3 7530 1727096033.36844: Calling groups_inventory to load vars for managed_node3 7530 1727096033.36846: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096033.36856: Calling all_plugins_play to load vars for managed_node3 7530 1727096033.36858: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096033.36861: Calling groups_plugins_play to load vars for managed_node3 7530 1727096033.37739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096033.39089: done with get_vars() 7530 1727096033.39132: done getting variables 7530 1727096033.39196: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:53:53 -0400 (0:00:00.042) 0:00:24.180 ****** 7530 1727096033.39227: entering _queue_task() for managed_node3/fail 7530 1727096033.39497: worker is 1 (out of 1 available) 7530 1727096033.39510: exiting _queue_task() for managed_node3/fail 7530 1727096033.39523: done queuing things up, now waiting for results queue to drain 7530 1727096033.39525: waiting for pending results... 
7530 1727096033.39729: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7530 1727096033.39829: in run() - task 0afff68d-5257-086b-f4f0-00000000006b 7530 1727096033.39840: variable 'ansible_search_path' from source: unknown 7530 1727096033.39844: variable 'ansible_search_path' from source: unknown 7530 1727096033.39877: calling self._execute() 7530 1727096033.39958: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096033.39962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096033.39974: variable 'omit' from source: magic vars 7530 1727096033.40271: variable 'ansible_distribution_major_version' from source: facts 7530 1727096033.40282: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096033.40373: variable 'network_state' from source: role '' defaults 7530 1727096033.40384: Evaluated conditional (network_state != {}): False 7530 1727096033.40387: when evaluation is False, skipping this task 7530 1727096033.40390: _execute() done 7530 1727096033.40394: dumping result to json 7530 1727096033.40397: done dumping result, returning 7530 1727096033.40403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-086b-f4f0-00000000006b] 7530 1727096033.40407: sending task result for task 0afff68d-5257-086b-f4f0-00000000006b 7530 1727096033.40504: done sending task result for task 0afff68d-5257-086b-f4f0-00000000006b 7530 1727096033.40509: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096033.40558: no more pending results, returning what we have 7530 
1727096033.40562: results queue empty 7530 1727096033.40563: checking for any_errors_fatal 7530 1727096033.40570: done checking for any_errors_fatal 7530 1727096033.40571: checking for max_fail_percentage 7530 1727096033.40573: done checking for max_fail_percentage 7530 1727096033.40574: checking to see if all hosts have failed and the running result is not ok 7530 1727096033.40575: done checking to see if all hosts have failed 7530 1727096033.40576: getting the remaining hosts for this loop 7530 1727096033.40583: done getting the remaining hosts for this loop 7530 1727096033.40587: getting the next task for host managed_node3 7530 1727096033.40594: done getting next task for host managed_node3 7530 1727096033.40598: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096033.40602: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096033.40621: getting variables 7530 1727096033.40623: in VariableManager get_vars() 7530 1727096033.40671: Calling all_inventory to load vars for managed_node3 7530 1727096033.40674: Calling groups_inventory to load vars for managed_node3 7530 1727096033.40676: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096033.40684: Calling all_plugins_play to load vars for managed_node3 7530 1727096033.40693: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096033.40696: Calling groups_plugins_play to load vars for managed_node3 7530 1727096033.41762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096033.43340: done with get_vars() 7530 1727096033.43377: done getting variables 7530 1727096033.43444: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:53:53 -0400 (0:00:00.042) 0:00:24.223 ****** 7530 1727096033.43480: entering _queue_task() for managed_node3/fail 7530 1727096033.43833: worker is 1 (out of 1 available) 7530 1727096033.43845: exiting _queue_task() for managed_node3/fail 7530 1727096033.43857: done queuing things up, now waiting for results queue to drain 7530 1727096033.43859: waiting for pending results... 
7530 1727096033.44498: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096033.44619: in run() - task 0afff68d-5257-086b-f4f0-00000000006c 7530 1727096033.44645: variable 'ansible_search_path' from source: unknown 7530 1727096033.44653: variable 'ansible_search_path' from source: unknown 7530 1727096033.44774: calling self._execute() 7530 1727096033.44810: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096033.44821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096033.44837: variable 'omit' from source: magic vars 7530 1727096033.45220: variable 'ansible_distribution_major_version' from source: facts 7530 1727096033.45248: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096033.45438: variable 'network_state' from source: role '' defaults 7530 1727096033.45457: Evaluated conditional (network_state != {}): False 7530 1727096033.45469: when evaluation is False, skipping this task 7530 1727096033.45477: _execute() done 7530 1727096033.45484: dumping result to json 7530 1727096033.45491: done dumping result, returning 7530 1727096033.45504: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-086b-f4f0-00000000006c] 7530 1727096033.45528: sending task result for task 0afff68d-5257-086b-f4f0-00000000006c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096033.45733: no more pending results, returning what we have 7530 1727096033.45736: results queue empty 7530 1727096033.45738: checking for any_errors_fatal 7530 1727096033.45746: done checking for any_errors_fatal 7530 1727096033.45746: 
checking for max_fail_percentage 7530 1727096033.45748: done checking for max_fail_percentage 7530 1727096033.45749: checking to see if all hosts have failed and the running result is not ok 7530 1727096033.45750: done checking to see if all hosts have failed 7530 1727096033.45751: getting the remaining hosts for this loop 7530 1727096033.45752: done getting the remaining hosts for this loop 7530 1727096033.45756: getting the next task for host managed_node3 7530 1727096033.45763: done getting next task for host managed_node3 7530 1727096033.45768: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096033.45772: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096033.45795: getting variables 7530 1727096033.45797: in VariableManager get_vars() 7530 1727096033.45854: Calling all_inventory to load vars for managed_node3 7530 1727096033.45857: Calling groups_inventory to load vars for managed_node3 7530 1727096033.45859: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096033.46082: Calling all_plugins_play to load vars for managed_node3 7530 1727096033.46087: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096033.46091: Calling groups_plugins_play to load vars for managed_node3 7530 1727096033.46784: done sending task result for task 0afff68d-5257-086b-f4f0-00000000006c 7530 1727096033.46788: WORKER PROCESS EXITING 7530 1727096033.57639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096033.59805: done with get_vars() 7530 1727096033.59951: done getting variables 7530 1727096033.60000: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:53:53 -0400 (0:00:00.165) 0:00:24.388 ****** 7530 1727096033.60031: entering _queue_task() for managed_node3/fail 7530 1727096033.60831: worker is 1 (out of 1 available) 7530 1727096033.60843: exiting _queue_task() for managed_node3/fail 7530 1727096033.60853: done queuing things up, now waiting for results queue to drain 7530 1727096033.60855: waiting for pending results... 
7530 1727096033.61186: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
7530 1727096033.61473: in run() - task 0afff68d-5257-086b-f4f0-00000000006d
7530 1727096033.61613: variable 'ansible_search_path' from source: unknown
7530 1727096033.61623: variable 'ansible_search_path' from source: unknown
7530 1727096033.61675: calling self._execute()
7530 1727096033.62052: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096033.62056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096033.62059: variable 'omit' from source: magic vars
7530 1727096033.62863: variable 'ansible_distribution_major_version' from source: facts
7530 1727096033.62875: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096033.63229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096033.67275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096033.67281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096033.67285: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096033.67674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096033.67678: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096033.67682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.67685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.67688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.68073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.68078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.68080: variable 'ansible_distribution_major_version' from source: facts
7530 1727096033.68082: Evaluated conditional (ansible_distribution_major_version | int > 9): True
7530 1727096033.68333: variable 'ansible_distribution' from source: facts
7530 1727096033.68343: variable '__network_rh_distros' from source: role '' defaults
7530 1727096033.68357: Evaluated conditional (ansible_distribution in __network_rh_distros): True
7530 1727096033.68820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.69101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.69273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.69277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.69280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.69283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.69285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.69294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.69338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.69773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.69776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.69780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.69784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.69787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.69790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.70281: variable 'network_connections' from source: task vars
7530 1727096033.70488: variable 'interface' from source: play vars
7530 1727096033.70563: variable 'interface' from source: play vars
7530 1727096033.70583: variable 'network_state' from source: role '' defaults
7530 1727096033.70660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096033.71065: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096033.71315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096033.71352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096033.71386: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096033.71437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096033.71465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096033.71773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.71776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096033.71779: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
7530 1727096033.71781: when evaluation is False, skipping this task
7530 1727096033.71783: _execute() done
7530 1727096033.71786: dumping result to json
7530 1727096033.71788: done dumping result, returning
7530 1727096033.71790: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-086b-f4f0-00000000006d]
7530 1727096033.71799: sending task result for task 0afff68d-5257-086b-f4f0-00000000006d
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
7530 1727096033.71951: no more pending results, returning what we have
7530 1727096033.71954: results queue empty
7530 1727096033.71955: checking for any_errors_fatal
7530 1727096033.71962: done checking for any_errors_fatal
7530 1727096033.71963: checking for max_fail_percentage
7530 1727096033.71965: done checking for max_fail_percentage
7530 1727096033.71966: checking to see if all hosts have failed and the running result is not ok
7530 1727096033.71968: done checking to see if all hosts have failed
7530 1727096033.71969: getting the remaining hosts for this loop
7530 1727096033.71971: done getting the remaining hosts for this loop
7530 1727096033.71974: getting the next task for host managed_node3
7530 1727096033.71980: done getting next task for host managed_node3
7530 1727096033.71984: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
7530 1727096033.71986: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096033.72187: getting variables
7530 1727096033.72189: in VariableManager get_vars()
7530 1727096033.72245: Calling all_inventory to load vars for managed_node3
7530 1727096033.72249: Calling groups_inventory to load vars for managed_node3
7530 1727096033.72251: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096033.72262: Calling all_plugins_play to load vars for managed_node3
7530 1727096033.72265: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096033.72271: Calling groups_plugins_play to load vars for managed_node3
7530 1727096033.72281: done sending task result for task 0afff68d-5257-086b-f4f0-00000000006d
7530 1727096033.72284: WORKER PROCESS EXITING
7530 1727096033.74799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096033.77829: done with get_vars()
7530 1727096033.77856: done getting variables
7530 1727096033.77924: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Monday 23 September 2024 08:53:53 -0400 (0:00:00.179) 0:00:24.567 ******
7530 1727096033.77962: entering _queue_task() for managed_node3/dnf
7530 1727096033.78355: worker is 1 (out of 1 available)
7530 1727096033.78517: exiting _queue_task() for managed_node3/dnf
7530 1727096033.78532: done queuing things up, now waiting for results queue to drain
7530 1727096033.78534: waiting for pending results...
7530 1727096033.78711: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
7530 1727096033.78888: in run() - task 0afff68d-5257-086b-f4f0-00000000006e
7530 1727096033.78909: variable 'ansible_search_path' from source: unknown
7530 1727096033.78919: variable 'ansible_search_path' from source: unknown
7530 1727096033.78977: calling self._execute()
7530 1727096033.79101: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096033.79113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096033.79133: variable 'omit' from source: magic vars
7530 1727096033.79599: variable 'ansible_distribution_major_version' from source: facts
7530 1727096033.79629: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096033.79910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096033.82499: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096033.82598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096033.82645: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096033.82694: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096033.82733: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096033.82835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.82873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.82913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.82960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.82983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.83139: variable 'ansible_distribution' from source: facts
7530 1727096033.83149: variable 'ansible_distribution_major_version' from source: facts
7530 1727096033.83213: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
7530 1727096033.83304: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096033.83464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.83494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.83523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.83582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.83603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.83761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.83765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.83770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.83773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.83782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.83833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.83870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.83902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.83948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.83975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.84302: variable 'network_connections' from source: task vars
7530 1727096033.84306: variable 'interface' from source: play vars
7530 1727096033.84309: variable 'interface' from source: play vars
7530 1727096033.84439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096033.84851: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096033.85071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096033.85075: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096033.85285: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096033.85290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096033.85294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096033.85575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.85579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096033.85585: variable '__network_team_connections_defined' from source: role '' defaults
7530 1727096033.86207: variable 'network_connections' from source: task vars
7530 1727096033.86374: variable 'interface' from source: play vars
7530 1727096033.86409: variable 'interface' from source: play vars
7530 1727096033.86502: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7530 1727096033.86510: when evaluation is False, skipping this task
7530 1727096033.86517: _execute() done
7530 1727096033.86530: dumping result to json
7530 1727096033.86541: done dumping result, returning
7530 1727096033.86677: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-00000000006e]
7530 1727096033.86680: sending task result for task 0afff68d-5257-086b-f4f0-00000000006e
7530 1727096033.87075: done sending task result for task 0afff68d-5257-086b-f4f0-00000000006e
7530 1727096033.87079: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096033.87138: no more pending results, returning what we have
7530 1727096033.87141: results queue empty
7530 1727096033.87142: checking for any_errors_fatal
7530 1727096033.87149: done checking for any_errors_fatal
7530 1727096033.87150: checking for max_fail_percentage
7530 1727096033.87152: done checking for max_fail_percentage
7530 1727096033.87152: checking to see if all hosts have failed and the running result is not ok
7530 1727096033.87154: done checking to see if all hosts have failed
7530 1727096033.87154: getting the remaining hosts for this loop
7530 1727096033.87156: done getting the remaining hosts for this loop
7530 1727096033.87160: getting the next task for host managed_node3
7530 1727096033.87166: done getting next task for host managed_node3
7530 1727096033.87172: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
7530 1727096033.87175: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096033.87195: getting variables
7530 1727096033.87196: in VariableManager get_vars()
7530 1727096033.87251: Calling all_inventory to load vars for managed_node3
7530 1727096033.87255: Calling groups_inventory to load vars for managed_node3
7530 1727096033.87257: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096033.87474: Calling all_plugins_play to load vars for managed_node3
7530 1727096033.87479: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096033.87483: Calling groups_plugins_play to load vars for managed_node3
7530 1727096033.89272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096033.90892: done with get_vars()
7530 1727096033.90932: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7530 1727096033.91013: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Monday 23 September 2024 08:53:53 -0400 (0:00:00.130) 0:00:24.698 ******
7530 1727096033.91053: entering _queue_task() for managed_node3/yum
7530 1727096033.91512: worker is 1 (out of 1 available)
7530 1727096033.91524: exiting _queue_task() for managed_node3/yum
7530 1727096033.91539: done queuing things up, now waiting for results queue to drain
7530 1727096033.91540: waiting for pending results...
7530 1727096033.91763: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
7530 1727096033.91923: in run() - task 0afff68d-5257-086b-f4f0-00000000006f
7530 1727096033.91950: variable 'ansible_search_path' from source: unknown
7530 1727096033.91958: variable 'ansible_search_path' from source: unknown
7530 1727096033.92008: calling self._execute()
7530 1727096033.92150: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096033.92154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096033.92156: variable 'omit' from source: magic vars
7530 1727096033.92585: variable 'ansible_distribution_major_version' from source: facts
7530 1727096033.92604: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096033.92854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096033.95639: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096033.95930: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096033.95935: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096033.96175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096033.96179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096033.96295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096033.96332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096033.96578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096033.96581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096033.96584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096033.96817: variable 'ansible_distribution_major_version' from source: facts
7530 1727096033.97009: Evaluated conditional (ansible_distribution_major_version | int < 8): False
7530 1727096033.97012: when evaluation is False, skipping this task
7530 1727096033.97015: _execute() done
7530 1727096033.97018: dumping result to json
7530 1727096033.97020: done dumping result, returning
7530 1727096033.97023: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-00000000006f]
7530 1727096033.97027: sending task result for task 0afff68d-5257-086b-f4f0-00000000006f
7530 1727096033.97101: done sending task result for task 0afff68d-5257-086b-f4f0-00000000006f
7530 1727096033.97103: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
7530 1727096033.97158: no more pending results, returning what we have
7530 1727096033.97161: results queue empty
7530 1727096033.97162: checking for any_errors_fatal
7530 1727096033.97172: done checking for any_errors_fatal
7530 1727096033.97173: checking for max_fail_percentage
7530 1727096033.97176: done checking for max_fail_percentage
7530 1727096033.97177: checking to see if all hosts have failed and the running result is not ok
7530 1727096033.97178: done checking to see if all hosts have failed
7530 1727096033.97178: getting the remaining hosts for this loop
7530 1727096033.97180: done getting the remaining hosts for this loop
7530 1727096033.97183: getting the next task for host managed_node3
7530 1727096033.97190: done getting next task for host managed_node3
7530 1727096033.97194: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7530 1727096033.97197: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096033.97215: getting variables
7530 1727096033.97216: in VariableManager get_vars()
7530 1727096033.97449: Calling all_inventory to load vars for managed_node3
7530 1727096033.97453: Calling groups_inventory to load vars for managed_node3
7530 1727096033.97456: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096033.97466: Calling all_plugins_play to load vars for managed_node3
7530 1727096033.97472: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096033.97475: Calling groups_plugins_play to load vars for managed_node3
7530 1727096034.00988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096034.04318: done with get_vars()
7530 1727096034.04354: done getting variables
7530 1727096034.04525: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Monday 23 September 2024 08:53:54 -0400 (0:00:00.135) 0:00:24.833 ******
7530 1727096034.04562: entering _queue_task() for managed_node3/fail
7530 1727096034.05441: worker is 1 (out of 1 available)
7530 1727096034.05456: exiting _queue_task() for managed_node3/fail
7530 1727096034.05471: done queuing things up, now waiting for results queue to drain
7530 1727096034.05472: waiting for pending results...
7530 1727096034.06198: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7530 1727096034.06402: in run() - task 0afff68d-5257-086b-f4f0-000000000070
7530 1727096034.06407: variable 'ansible_search_path' from source: unknown
7530 1727096034.06473: variable 'ansible_search_path' from source: unknown
7530 1727096034.06478: calling self._execute()
7530 1727096034.06702: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096034.06714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096034.06765: variable 'omit' from source: magic vars
7530 1727096034.07676: variable 'ansible_distribution_major_version' from source: facts
7530 1727096034.07681: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096034.08030: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096034.08328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096034.13222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096034.13455: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096034.13522: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096034.13559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096034.13590: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096034.13664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.13898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.13921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.13962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.14197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.14314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.14318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.14320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.14423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.14426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.14672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.14675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.14678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.14694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.15075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.15473: variable 'network_connections' from source: task vars
7530 1727096034.15478: variable 'interface' from source: play vars
7530 1727096034.15481: variable 'interface' from source: play vars
7530 1727096034.15483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096034.15796: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096034.16005: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096034.16084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096034.16119: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096034.16174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096034.16205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096034.16239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.16275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096034.16334: variable '__network_team_connections_defined' from source: role '' defaults
7530 1727096034.16588: variable 'network_connections' from source: task vars
7530 1727096034.16598: variable 'interface' from source: play vars
7530 1727096034.16670: variable 'interface' from source: play vars
7530 1727096034.16701: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7530 1727096034.16710: when evaluation is False, skipping this task
7530 1727096034.16717: _execute() done
7530 1727096034.16724: dumping result to json
7530 1727096034.16736: done dumping result, returning
7530 1727096034.16750: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000070]
7530 1727096034.16761: sending task result for task 0afff68d-5257-086b-f4f0-000000000070
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096034.16925: no more pending results, returning what we have
7530 1727096034.16928: results queue empty
7530 1727096034.16930: checking for any_errors_fatal
7530 1727096034.16934: done checking for any_errors_fatal
7530 1727096034.16935: checking for max_fail_percentage
7530 1727096034.16937: done checking for max_fail_percentage
7530 1727096034.16938: checking to see if all hosts have failed and the running result is not ok
7530 1727096034.16939: done checking to see if all hosts have failed
7530 1727096034.16939: getting the remaining hosts for this loop
7530 1727096034.16941: done getting the remaining hosts for this loop
7530 1727096034.16944: getting the next task for host managed_node3
7530 1727096034.16951: done getting next task for host managed_node3
7530 1727096034.16954: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
7530 1727096034.16957: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096034.16986: getting variables
7530 1727096034.16987: in VariableManager get_vars()
7530 1727096034.17036: Calling all_inventory to load vars for managed_node3
7530 1727096034.17039: Calling groups_inventory to load vars for managed_node3
7530 1727096034.17041: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096034.17053: Calling all_plugins_play to load vars for managed_node3
7530 1727096034.17055: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096034.17059: Calling groups_plugins_play to load vars for managed_node3
7530 1727096034.17586: done sending task result for task 0afff68d-5257-086b-f4f0-000000000070
7530 1727096034.17591: WORKER PROCESS EXITING
7530 1727096034.18744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096034.21240: done with get_vars()
7530 1727096034.21374: done getting variables
7530 1727096034.21464: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Monday 23 September 2024 08:53:54 -0400 (0:00:00.169) 0:00:25.003 ******
7530 1727096034.21504: entering _queue_task() for managed_node3/package
7530 1727096034.21858: worker is 1 (out of 1 available)
7530 1727096034.21874: exiting _queue_task() for managed_node3/package
7530 1727096034.21885: done queuing things up, now waiting for results queue to drain
7530 1727096034.21887: waiting for pending results...
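The consent task traced above (main.yml:60) was skipped because its `when` clause evaluated to False: neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was true. A hedged sketch of that gating pattern follows; only the two condition variables and the distribution check are taken from the log, while the task body and message are illustrative assumptions, not the role's actual source:

```yaml
# Hedged sketch of a consent-gate task like the one traced above.
# The when: clauses mirror the conditionals evaluated in the log;
# the fail body and message are assumed for illustration.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Restarting NetworkManager would disrupt wireless or team interfaces;
      re-run only after confirming this is acceptable.
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined
```

When every `when:` clause is false, Ansible emits `skipping: [host]` with `false_condition` naming the clause that failed, exactly as in the trace.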
7530 1727096034.22153: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
7530 1727096034.22314: in run() - task 0afff68d-5257-086b-f4f0-000000000071
7530 1727096034.22340: variable 'ansible_search_path' from source: unknown
7530 1727096034.22349: variable 'ansible_search_path' from source: unknown
7530 1727096034.22397: calling self._execute()
7530 1727096034.22509: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096034.22521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096034.22537: variable 'omit' from source: magic vars
7530 1727096034.22919: variable 'ansible_distribution_major_version' from source: facts
7530 1727096034.22940: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096034.23158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096034.23446: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096034.23537: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096034.23975: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096034.23979: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096034.24127: variable 'network_packages' from source: role '' defaults
7530 1727096034.24360: variable '__network_provider_setup' from source: role '' defaults
7530 1727096034.24435: variable '__network_service_name_default_nm' from source: role '' defaults
7530 1727096034.24634: variable '__network_service_name_default_nm' from source: role '' defaults
7530 1727096034.24653: variable '__network_packages_default_nm' from source: role '' defaults
7530 1727096034.24786: variable '__network_packages_default_nm' from source: role '' defaults
7530 1727096034.24951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096034.27543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096034.27624: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096034.27671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096034.27792: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096034.27795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096034.27835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.27871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.27907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.27951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.27976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.28122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.28153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.28186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.28238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.28260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.28873: variable '__network_packages_default_gobject_packages' from source: role '' defaults
7530 1727096034.29023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.29075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.29227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.29444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.29447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.29580: variable 'ansible_python' from source: facts
7530 1727096034.29613: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
7530 1727096034.29824: variable '__network_wpa_supplicant_required' from source: role '' defaults
7530 1727096034.29915: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
7530 1727096034.30229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.30339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.30375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.30525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.30546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.30598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096034.30639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096034.30687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.30733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096034.30772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096034.31166: variable 'network_connections' from source: task vars
7530 1727096034.31172: variable 'interface' from source: play vars
7530 1727096034.31243: variable 'interface' from source: play vars
7530 1727096034.31378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096034.31382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096034.31426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096034.31461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096034.31529: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096034.31854: variable 'network_connections' from source: task vars
7530 1727096034.31865: variable 'interface' from source: play vars
7530 1727096034.32033: variable 'interface' from source: play vars
7530 1727096034.32036: variable '__network_packages_default_wireless' from source: role '' defaults
7530 1727096034.32114: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096034.32483: variable 'network_connections' from source: task vars
7530 1727096034.32518: variable 'interface' from source: play vars
7530 1727096034.32660: variable 'interface' from source: play vars
7530 1727096034.32738: variable '__network_packages_default_team' from source: role '' defaults
7530 1727096034.32815: variable '__network_team_connections_defined' from source: role '' defaults
7530 1727096034.33164: variable 'network_connections' from source: task vars
7530 1727096034.33179: variable 'interface' from source: play vars
7530 1727096034.33342: variable 'interface' from source: play vars
7530 1727096034.33346: variable '__network_service_name_default_initscripts' from source: role '' defaults
7530 1727096034.33396: variable '__network_service_name_default_initscripts' from source: role '' defaults
7530 1727096034.33410: variable '__network_packages_default_initscripts' from source: role '' defaults
7530 1727096034.33481: variable '__network_packages_default_initscripts' from source: role '' defaults
7530 1727096034.33723: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
7530 1727096034.34288: variable 'network_connections' from source: task vars
7530 1727096034.34301: variable 'interface' from source: play vars
7530 1727096034.34384: variable 'interface' from source: play vars
7530 1727096034.34398: variable 'ansible_distribution' from source: facts
7530 1727096034.34407: variable '__network_rh_distros' from source: role '' defaults
7530 1727096034.34418: variable 'ansible_distribution_major_version' from source: facts
7530 1727096034.34448: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
7530 1727096034.34707: variable 'ansible_distribution' from source: facts
7530 1727096034.34758: variable '__network_rh_distros' from source: role '' defaults
7530 1727096034.34762: variable 'ansible_distribution_major_version' from source: facts
7530 1727096034.34764: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
7530 1727096034.34953: variable 'ansible_distribution' from source: facts
7530 1727096034.34963: variable '__network_rh_distros' from source: role '' defaults
7530 1727096034.34984: variable 'ansible_distribution_major_version' from source: facts
7530 1727096034.35029: variable 'network_provider' from source: set_fact
7530 1727096034.35050: variable 'ansible_facts' from source: unknown
7530 1727096034.35847: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
7530 1727096034.35850: when evaluation is False, skipping this task
7530 1727096034.35852: _execute() done
7530 1727096034.35854: dumping result to json
7530 1727096034.35856: done dumping result, returning
7530 1727096034.35858: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-086b-f4f0-000000000071]
7530 1727096034.35860: sending task result for task 0afff68d-5257-086b-f4f0-000000000071
7530 1727096034.35931: done sending task result for task 0afff68d-5257-086b-f4f0-000000000071
7530 1727096034.35934: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
7530 1727096034.36219: no more pending results, returning what we have
7530 1727096034.36222: results queue empty
7530 1727096034.36223: checking for any_errors_fatal
7530 1727096034.36229: done checking for any_errors_fatal
7530 1727096034.36230: checking for max_fail_percentage
7530 1727096034.36231: done checking for max_fail_percentage
7530 1727096034.36232: checking to see if all hosts have failed and the running result is not ok
7530 1727096034.36233: done checking to see if all hosts have failed
7530 1727096034.36234: getting the remaining hosts for this loop
7530 1727096034.36235: done getting the remaining hosts for this loop
7530 1727096034.36239: getting the next task for host managed_node3
7530 1727096034.36246: done getting next task for host managed_node3
7530 1727096034.36250: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7530 1727096034.36252: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096034.36273: getting variables
7530 1727096034.36275: in VariableManager get_vars()
7530 1727096034.36331: Calling all_inventory to load vars for managed_node3
7530 1727096034.36335: Calling groups_inventory to load vars for managed_node3
7530 1727096034.36337: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096034.36349: Calling all_plugins_play to load vars for managed_node3
7530 1727096034.36352: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096034.36355: Calling groups_plugins_play to load vars for managed_node3
7530 1727096034.38346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096034.39986: done with get_vars()
7530 1727096034.40021: done getting variables
7530 1727096034.40096: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Monday 23 September 2024 08:53:54 -0400 (0:00:00.186) 0:00:25.189 ******
7530 1727096034.40131: entering _queue_task() for managed_node3/package
7530 1727096034.40684: worker is 1 (out of 1 available)
7530 1727096034.40696: exiting _queue_task() for managed_node3/package
7530 1727096034.40708: done queuing things up, now waiting for results queue to drain
7530 1727096034.40710: waiting for pending results...
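The "Install packages" task traced above (main.yml:73) was skipped because its condition, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False: every entry in `network_packages` was already present in the gathered package facts, so there is nothing to install. A hedged sketch of that pattern follows; the condition string is copied verbatim from the log, while the `ansible.builtin.package` body is an assumption for illustration:

```yaml
# Hedged sketch of the skip condition traced above. Jinja2's `subset`
# test returns true when every element of the left-hand list occurs in
# the right-hand collection, so the negated form runs the install only
# when at least one package is missing from the package facts.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

This requires package facts to have been gathered earlier (e.g. via `ansible.builtin.package_facts`), which is why the trace resolves `ansible_facts` just before evaluating the conditional.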
7530 1727096034.40951: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7530 1727096034.41003: in run() - task 0afff68d-5257-086b-f4f0-000000000072
7530 1727096034.41047: variable 'ansible_search_path' from source: unknown
7530 1727096034.41051: variable 'ansible_search_path' from source: unknown
7530 1727096034.41157: calling self._execute()
7530 1727096034.41212: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096034.41223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096034.41238: variable 'omit' from source: magic vars
7530 1727096034.41659: variable 'ansible_distribution_major_version' from source: facts
7530 1727096034.41682: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096034.41827: variable 'network_state' from source: role '' defaults
7530 1727096034.41846: Evaluated conditional (network_state != {}): False
7530 1727096034.41855: when evaluation is False, skipping this task
7530 1727096034.41863: _execute() done
7530 1727096034.41874: dumping result to json
7530 1727096034.41882: done dumping result, returning
7530 1727096034.41921: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-086b-f4f0-000000000072]
7530 1727096034.41925: sending task result for task 0afff68d-5257-086b-f4f0-000000000072
7530 1727096034.42187: done sending task result for task 0afff68d-5257-086b-f4f0-000000000072
7530 1727096034.42191: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7530 1727096034.42244: no more pending results, returning what we have
7530 1727096034.42249: results queue empty
7530 1727096034.42250: checking for any_errors_fatal
7530 1727096034.42258: done checking for any_errors_fatal
7530 1727096034.42259: checking for max_fail_percentage
7530 1727096034.42261: done checking for max_fail_percentage
7530 1727096034.42262: checking to see if all hosts have failed and the running result is not ok
7530 1727096034.42263: done checking to see if all hosts have failed
7530 1727096034.42263: getting the remaining hosts for this loop
7530 1727096034.42265: done getting the remaining hosts for this loop
7530 1727096034.42273: getting the next task for host managed_node3
7530 1727096034.42281: done getting next task for host managed_node3
7530 1727096034.42285: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
7530 1727096034.42293: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096034.42316: getting variables
7530 1727096034.42318: in VariableManager get_vars()
7530 1727096034.42406: Calling all_inventory to load vars for managed_node3
7530 1727096034.42409: Calling groups_inventory to load vars for managed_node3
7530 1727096034.42412: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096034.42424: Calling all_plugins_play to load vars for managed_node3
7530 1727096034.42427: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096034.42430: Calling groups_plugins_play to load vars for managed_node3
7530 1727096034.43879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096034.45517: done with get_vars()
7530 1727096034.45548: done getting variables
7530 1727096034.45618: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Monday 23 September 2024 08:53:54 -0400 (0:00:00.055) 0:00:25.244 ******
7530 1727096034.45652: entering _queue_task() for managed_node3/package
7530 1727096034.46012: worker is 1 (out of 1 available)
7530 1727096034.46024: exiting _queue_task() for managed_node3/package
7530 1727096034.46037: done queuing things up, now waiting for results queue to drain
7530 1727096034.46038: waiting for pending results...
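Both nmstate-related install tasks in this stretch of the trace (main.yml:85 and main.yml:96) are gated on the same condition, `network_state != {}`: they run only when the play supplies a non-empty `network_state` dictionary, and here `network_state` still holds its empty role default, so both skip. A hedged sketch of that gate follows; the condition is verbatim from the log, while the package list is inferred from the task title rather than read from the role source:

```yaml
# Hedged sketch: the network_state gate seen in the trace. The when:
# condition is copied from the log; the package names are assumptions
# based on the task title "Install NetworkManager and nmstate when
# using network_state variable".
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

Note that `{}` in a `when:` expression is an empty Jinja2 dict literal, so the comparison is a plain inequality test against the variable's default value.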
7530 1727096034.46399: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7530 1727096034.46504: in run() - task 0afff68d-5257-086b-f4f0-000000000073 7530 1727096034.46525: variable 'ansible_search_path' from source: unknown 7530 1727096034.46535: variable 'ansible_search_path' from source: unknown 7530 1727096034.46579: calling self._execute() 7530 1727096034.46688: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096034.46700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096034.46821: variable 'omit' from source: magic vars 7530 1727096034.47129: variable 'ansible_distribution_major_version' from source: facts 7530 1727096034.47157: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096034.47292: variable 'network_state' from source: role '' defaults 7530 1727096034.47308: Evaluated conditional (network_state != {}): False 7530 1727096034.47315: when evaluation is False, skipping this task 7530 1727096034.47322: _execute() done 7530 1727096034.47328: dumping result to json 7530 1727096034.47335: done dumping result, returning 7530 1727096034.47347: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-086b-f4f0-000000000073] 7530 1727096034.47357: sending task result for task 0afff68d-5257-086b-f4f0-000000000073 7530 1727096034.47583: done sending task result for task 0afff68d-5257-086b-f4f0-000000000073 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096034.47638: no more pending results, returning what we have 7530 1727096034.47642: results queue empty 7530 1727096034.47643: checking for any_errors_fatal 7530 1727096034.47651: done checking for any_errors_fatal 7530 
1727096034.47652: checking for max_fail_percentage 7530 1727096034.47654: done checking for max_fail_percentage 7530 1727096034.47655: checking to see if all hosts have failed and the running result is not ok 7530 1727096034.47657: done checking to see if all hosts have failed 7530 1727096034.47657: getting the remaining hosts for this loop 7530 1727096034.47659: done getting the remaining hosts for this loop 7530 1727096034.47663: getting the next task for host managed_node3 7530 1727096034.47673: done getting next task for host managed_node3 7530 1727096034.47677: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096034.47680: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096034.47701: getting variables 7530 1727096034.47703: in VariableManager get_vars() 7530 1727096034.47759: Calling all_inventory to load vars for managed_node3 7530 1727096034.47763: Calling groups_inventory to load vars for managed_node3 7530 1727096034.47766: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096034.47820: Calling all_plugins_play to load vars for managed_node3 7530 1727096034.47823: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096034.47834: WORKER PROCESS EXITING 7530 1727096034.47838: Calling groups_plugins_play to load vars for managed_node3 7530 1727096034.48793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096034.49666: done with get_vars() 7530 1727096034.49689: done getting variables 7530 1727096034.49752: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:53:54 -0400 (0:00:00.041) 0:00:25.286 ****** 7530 1727096034.49785: entering _queue_task() for managed_node3/service 7530 1727096034.50142: worker is 1 (out of 1 available) 7530 1727096034.50156: exiting _queue_task() for managed_node3/service 7530 1727096034.50169: done queuing things up, now waiting for results queue to drain 7530 1727096034.50171: waiting for pending results... 
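The "Install python3-libnmstate when using network_state variable" task above was skipped because its `when:` condition, `network_state != {}`, evaluated False: the role default for `network_state` is an empty dict and nothing in the play overrode it. A minimal sketch of that decision, using a hypothetical helper name (this is not ansible-core's actual code path, just the logic the conditional expresses):

```python
# Sketch of the skip decision seen in the trace above.
# `should_install_libnmstate` is a hypothetical helper, not part of
# ansible-core or the fedora.linux_system_roles.network role.
def should_install_libnmstate(task_vars: dict) -> bool:
    # Role default: network_state is {} unless the caller sets it.
    network_state = task_vars.get("network_state", {})
    # The task's `when:` condition -- install python3-libnmstate only
    # when the play actually uses the network_state interface.
    return network_state != {}

# No network_state in the play vars -> condition False -> task skipped.
print(should_install_libnmstate({}))  # False
# A play that drives nmstate would make the condition True.
print(should_install_libnmstate({"network_state": {"interfaces": []}}))  # True
```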
7530 1727096034.50492: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096034.50675: in run() - task 0afff68d-5257-086b-f4f0-000000000074 7530 1727096034.50680: variable 'ansible_search_path' from source: unknown 7530 1727096034.50683: variable 'ansible_search_path' from source: unknown 7530 1727096034.50689: calling self._execute() 7530 1727096034.50803: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096034.50816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096034.50835: variable 'omit' from source: magic vars 7530 1727096034.51175: variable 'ansible_distribution_major_version' from source: facts 7530 1727096034.51187: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096034.51274: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096034.51410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096034.52942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096034.53047: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096034.53051: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096034.53070: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096034.53172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096034.53189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 
1727096034.53223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.53252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.53298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.53319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.53384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096034.53415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.53444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.53489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.53509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7530 1727096034.53553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096034.53583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.53775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.53778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.53781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.53876: variable 'network_connections' from source: task vars 7530 1727096034.53903: variable 'interface' from source: play vars 7530 1727096034.53995: variable 'interface' from source: play vars 7530 1727096034.54078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096034.54277: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096034.54348: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096034.54384: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096034.54420: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096034.54481: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096034.54505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096034.54543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.54561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096034.54611: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096034.54802: variable 'network_connections' from source: task vars 7530 1727096034.54805: variable 'interface' from source: play vars 7530 1727096034.54857: variable 'interface' from source: play vars 7530 1727096034.54879: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096034.54883: when evaluation is False, skipping this task 7530 1727096034.54886: _execute() done 7530 1727096034.54888: dumping result to json 7530 1727096034.54890: done dumping result, returning 7530 1727096034.54897: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000074] 7530 1727096034.54902: sending task result for task 0afff68d-5257-086b-f4f0-000000000074 7530 1727096034.54986: done sending task result for task 0afff68d-5257-086b-f4f0-000000000074 7530 1727096034.54997: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096034.55044: no more pending results, returning what we have 7530 1727096034.55048: results queue empty 7530 1727096034.55049: checking for any_errors_fatal 7530 1727096034.55054: done checking for any_errors_fatal 7530 1727096034.55055: checking for max_fail_percentage 7530 1727096034.55057: done checking for max_fail_percentage 7530 1727096034.55058: checking to see if all hosts have failed and the running result is not ok 7530 1727096034.55059: done checking to see if all hosts have failed 7530 1727096034.55059: getting the remaining hosts for this loop 7530 1727096034.55061: done getting the remaining hosts for this loop 7530 1727096034.55064: getting the next task for host managed_node3 7530 1727096034.55075: done getting next task for host managed_node3 7530 1727096034.55078: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096034.55081: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096034.55097: getting variables 7530 1727096034.55099: in VariableManager get_vars() 7530 1727096034.55154: Calling all_inventory to load vars for managed_node3 7530 1727096034.55157: Calling groups_inventory to load vars for managed_node3 7530 1727096034.55159: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096034.55170: Calling all_plugins_play to load vars for managed_node3 7530 1727096034.55173: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096034.55176: Calling groups_plugins_play to load vars for managed_node3 7530 1727096034.55962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096034.57286: done with get_vars() 7530 1727096034.57309: done getting variables 7530 1727096034.57372: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:53:54 -0400 (0:00:00.076) 0:00:25.362 ****** 7530 1727096034.57405: entering _queue_task() for managed_node3/service 7530 1727096034.57728: worker is 1 (out of 1 available) 7530 1727096034.57743: exiting _queue_task() for managed_node3/service 7530 1727096034.57755: done queuing things up, now waiting for results queue to drain 7530 1727096034.57757: waiting for pending results... 
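The "Restart NetworkManager due to wireless or team interfaces" task above was likewise skipped: `__network_wireless_connections_defined or __network_team_connections_defined` came out False for the play's `network_connections`. Those role defaults amount to asking whether any managed connection is of a type that forces a NetworkManager restart. A rough sketch, assuming the check keys off each connection's `type` field (an assumption about the role's internals, with hypothetical sample data):

```python
# Approximation of the conditional evaluated above; the real role
# computes this via Jinja2 expressions over network_connections.
def needs_nm_restart(network_connections: list) -> bool:
    # Restart NetworkManager only when a wireless or team connection
    # is being managed (assumed to be matched on the `type` key).
    restart_types = {"wireless", "team"}
    return any(c.get("type") in restart_types for c in network_connections)

# An ethernet-only play (as in this run) does not trigger a restart.
print(needs_nm_restart([{"name": "eth0", "type": "ethernet"}]))  # False
# A team or wireless connection would.
print(needs_nm_restart([{"name": "team0", "type": "team"}]))  # True
```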
7530 1727096034.58188: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096034.58197: in run() - task 0afff68d-5257-086b-f4f0-000000000075 7530 1727096034.58222: variable 'ansible_search_path' from source: unknown 7530 1727096034.58234: variable 'ansible_search_path' from source: unknown 7530 1727096034.58275: calling self._execute() 7530 1727096034.58405: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096034.58409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096034.58417: variable 'omit' from source: magic vars 7530 1727096034.58744: variable 'ansible_distribution_major_version' from source: facts 7530 1727096034.58754: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096034.58872: variable 'network_provider' from source: set_fact 7530 1727096034.58877: variable 'network_state' from source: role '' defaults 7530 1727096034.58889: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7530 1727096034.58894: variable 'omit' from source: magic vars 7530 1727096034.58937: variable 'omit' from source: magic vars 7530 1727096034.58958: variable 'network_service_name' from source: role '' defaults 7530 1727096034.59009: variable 'network_service_name' from source: role '' defaults 7530 1727096034.59083: variable '__network_provider_setup' from source: role '' defaults 7530 1727096034.59088: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096034.59137: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096034.59144: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096034.59189: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096034.59342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 
1727096034.61052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096034.61273: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096034.61277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096034.61279: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096034.61281: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096034.61336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096034.61405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.61408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.61427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.61441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.61479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7530 1727096034.61495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.61512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.61542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.61552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.61707: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7530 1727096034.61795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096034.61812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.61828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.61854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.61872: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.61934: variable 'ansible_python' from source: facts 7530 1727096034.61952: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7530 1727096034.62014: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096034.62071: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096034.62155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096034.62173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.62192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.62217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.62227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.62261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096034.62282: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096034.62301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.62325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096034.62338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096034.62430: variable 'network_connections' from source: task vars 7530 1727096034.62439: variable 'interface' from source: play vars 7530 1727096034.62492: variable 'interface' from source: play vars 7530 1727096034.62566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096034.62708: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096034.62748: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096034.62780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096034.62810: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096034.62858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096034.62883: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096034.62906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096034.62928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096034.62970: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096034.63152: variable 'network_connections' from source: task vars 7530 1727096034.63157: variable 'interface' from source: play vars 7530 1727096034.63214: variable 'interface' from source: play vars 7530 1727096034.63242: variable '__network_packages_default_wireless' from source: role '' defaults 7530 1727096034.63300: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096034.63488: variable 'network_connections' from source: task vars 7530 1727096034.63492: variable 'interface' from source: play vars 7530 1727096034.63544: variable 'interface' from source: play vars 7530 1727096034.63561: variable '__network_packages_default_team' from source: role '' defaults 7530 1727096034.63618: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096034.63802: variable 'network_connections' from source: task vars 7530 1727096034.63806: variable 'interface' from source: play vars 7530 1727096034.63860: variable 'interface' from source: play vars 7530 1727096034.63897: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096034.63943: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096034.63949: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7530 1727096034.63992: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096034.64126: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7530 1727096034.64452: variable 'network_connections' from source: task vars 7530 1727096034.64456: variable 'interface' from source: play vars 7530 1727096034.64503: variable 'interface' from source: play vars 7530 1727096034.64509: variable 'ansible_distribution' from source: facts 7530 1727096034.64512: variable '__network_rh_distros' from source: role '' defaults 7530 1727096034.64519: variable 'ansible_distribution_major_version' from source: facts 7530 1727096034.64532: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7530 1727096034.64649: variable 'ansible_distribution' from source: facts 7530 1727096034.64652: variable '__network_rh_distros' from source: role '' defaults 7530 1727096034.64656: variable 'ansible_distribution_major_version' from source: facts 7530 1727096034.64669: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7530 1727096034.64781: variable 'ansible_distribution' from source: facts 7530 1727096034.64785: variable '__network_rh_distros' from source: role '' defaults 7530 1727096034.64787: variable 'ansible_distribution_major_version' from source: facts 7530 1727096034.64817: variable 'network_provider' from source: set_fact 7530 1727096034.64839: variable 'omit' from source: magic vars 7530 1727096034.64862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096034.64886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096034.64900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096034.64914: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096034.64924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096034.64948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096034.64951: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096034.64953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096034.65020: Set connection var ansible_pipelining to False 7530 1727096034.65025: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096034.65035: Set connection var ansible_timeout to 10 7530 1727096034.65042: Set connection var ansible_shell_executable to /bin/sh 7530 1727096034.65045: Set connection var ansible_shell_type to sh 7530 1727096034.65047: Set connection var ansible_connection to ssh 7530 1727096034.65066: variable 'ansible_shell_executable' from source: unknown 7530 1727096034.65071: variable 'ansible_connection' from source: unknown 7530 1727096034.65073: variable 'ansible_module_compression' from source: unknown 7530 1727096034.65075: variable 'ansible_shell_type' from source: unknown 7530 1727096034.65077: variable 'ansible_shell_executable' from source: unknown 7530 1727096034.65079: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096034.65084: variable 'ansible_pipelining' from source: unknown 7530 1727096034.65086: variable 'ansible_timeout' from source: unknown 7530 1727096034.65090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096034.65170: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096034.65178: variable 'omit' from source: magic vars 7530 1727096034.65186: starting attempt loop 7530 1727096034.65188: running the handler 7530 1727096034.65247: variable 'ansible_facts' from source: unknown 7530 1727096034.65728: _low_level_execute_command(): starting 7530 1727096034.65736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096034.66241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096034.66247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096034.66250: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.66302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096034.66305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096034.66307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096034.66351: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096034.68057: stdout chunk (state=3): >>>/root <<< 7530 1727096034.68156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096034.68189: stderr chunk (state=3): >>><<< 7530 1727096034.68192: stdout chunk (state=3): >>><<< 7530 1727096034.68216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096034.68231: _low_level_execute_command(): starting 7530 1727096034.68236: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894 `" && echo ansible-tmp-1727096034.6821628-8487-54665832028894="` echo 
/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894 `" ) && sleep 0' 7530 1727096034.68707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096034.68711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.68713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096034.68715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096034.68717: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.68776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096034.68779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096034.68789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096034.68823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096034.70850: stdout chunk (state=3): >>>ansible-tmp-1727096034.6821628-8487-54665832028894=/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894 <<< 7530 1727096034.70957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096034.70993: 
stderr chunk (state=3): >>><<< 7530 1727096034.70996: stdout chunk (state=3): >>><<< 7530 1727096034.71012: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096034.6821628-8487-54665832028894=/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096034.71042: variable 'ansible_module_compression' from source: unknown 7530 1727096034.71089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7530 1727096034.71143: variable 'ansible_facts' from source: unknown 7530 1727096034.71281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py 7530 1727096034.71392: Sending initial data 7530 1727096034.71396: Sent initial data (153 
bytes) 7530 1727096034.71835: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096034.71849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096034.71873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096034.71883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.71932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096034.71936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096034.71938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096034.71985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096034.73653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096034.73693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096034.73745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpwucndu3m /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py <<< 7530 1727096034.73748: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py" <<< 7530 1727096034.73779: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpwucndu3m" to remote "/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py" <<< 7530 1727096034.75297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096034.75472: stderr chunk (state=3): >>><<< 7530 1727096034.75476: stdout chunk (state=3): >>><<< 7530 1727096034.75478: done transferring module to remote 7530 1727096034.75480: _low_level_execute_command(): starting 7530 1727096034.75482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/ /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py && sleep 0' 7530 1727096034.76060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096034.76079: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096034.76094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096034.76113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096034.76139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096034.76152: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096034.76243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.76261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096034.76281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096034.76300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096034.76369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096034.78324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096034.78437: stderr chunk (state=3): >>><<< 7530 1727096034.78440: stdout chunk (state=3): >>><<< 7530 1727096034.78443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096034.78445: _low_level_execute_command(): starting 7530 1727096034.78447: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/AnsiballZ_systemd.py && sleep 0' 7530 1727096034.78994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096034.79003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096034.79014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096034.79028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096034.79046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096034.79053: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096034.79063: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.79085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096034.79111: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096034.79119: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096034.79122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096034.79124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096034.79126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096034.79274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096034.79290: stderr chunk (state=3): >>>debug2: match found <<< 7530 1727096034.79293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096034.79295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096034.79297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096034.79299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096034.79342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096035.09864: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", 
"MemoryCurrent": "9490432", "MemoryPeak": "10018816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3340083200", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "114355000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7530 1727096035.09872: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", 
"NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target Net<<< 7530 1727096035.09886: stdout chunk (state=3): >>>workManager-wait-online.service", "After": "systemd-journald.socket 
dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": 
false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7530 1727096035.11911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096035.11938: stderr chunk (state=3): >>><<< 7530 1727096035.11941: stdout chunk (state=3): >>><<< 7530 1727096035.11962: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9490432", "MemoryPeak": "10018816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3340083200", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "114355000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", 
"StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", 
"SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096035.12086: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096035.12103: _low_level_execute_command(): starting 7530 1727096035.12108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096034.6821628-8487-54665832028894/ > /dev/null 2>&1 && sleep 0' 7530 1727096035.12573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096035.12577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096035.12579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096035.12582: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096035.12589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096035.12629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096035.12645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096035.12690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096035.14584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096035.14606: stderr chunk (state=3): >>><<< 7530 1727096035.14610: stdout chunk (state=3): >>><<< 7530 1727096035.14623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096035.14631: handler run complete 7530 1727096035.14677: attempt loop complete, returning result 7530 1727096035.14684: _execute() done 7530 1727096035.14687: dumping result to json 7530 1727096035.14699: done dumping result, returning 7530 1727096035.14707: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-086b-f4f0-000000000075] 7530 1727096035.14711: sending task result for task 0afff68d-5257-086b-f4f0-000000000075 7530 1727096035.14949: done sending task result for task 0afff68d-5257-086b-f4f0-000000000075 7530 1727096035.14951: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096035.15010: no more pending results, returning what we have 7530 1727096035.15013: results queue empty 7530 1727096035.15014: checking for any_errors_fatal 7530 1727096035.15020: done checking for any_errors_fatal 7530 1727096035.15021: checking for max_fail_percentage 7530 1727096035.15022: done checking for max_fail_percentage 7530 1727096035.15023: checking to see if all hosts have failed and the running result is not ok 7530 1727096035.15024: done checking to see if all hosts have failed 7530 1727096035.15024: getting the remaining hosts for this loop 7530 1727096035.15028: done getting the remaining hosts for this loop 7530 1727096035.15031: getting the next task for host managed_node3 7530 1727096035.15037: done getting next task for host managed_node3 7530 1727096035.15040: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096035.15043: ^ state is: HOST STATE: block=2, 
task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096035.15052: getting variables 7530 1727096035.15054: in VariableManager get_vars() 7530 1727096035.15102: Calling all_inventory to load vars for managed_node3 7530 1727096035.15105: Calling groups_inventory to load vars for managed_node3 7530 1727096035.15107: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096035.15116: Calling all_plugins_play to load vars for managed_node3 7530 1727096035.15118: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096035.15120: Calling groups_plugins_play to load vars for managed_node3 7530 1727096035.15889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096035.16778: done with get_vars() 7530 1727096035.16803: done getting variables 7530 1727096035.16853: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:53:55 -0400 (0:00:00.594) 0:00:25.957 ****** 7530 1727096035.16884: entering _queue_task() for 
managed_node3/service 7530 1727096035.17146: worker is 1 (out of 1 available) 7530 1727096035.17159: exiting _queue_task() for managed_node3/service 7530 1727096035.17172: done queuing things up, now waiting for results queue to drain 7530 1727096035.17174: waiting for pending results... 7530 1727096035.17354: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096035.17453: in run() - task 0afff68d-5257-086b-f4f0-000000000076 7530 1727096035.17465: variable 'ansible_search_path' from source: unknown 7530 1727096035.17470: variable 'ansible_search_path' from source: unknown 7530 1727096035.17498: calling self._execute() 7530 1727096035.17581: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096035.17585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096035.17594: variable 'omit' from source: magic vars 7530 1727096035.17880: variable 'ansible_distribution_major_version' from source: facts 7530 1727096035.17889: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096035.17972: variable 'network_provider' from source: set_fact 7530 1727096035.17979: Evaluated conditional (network_provider == "nm"): True 7530 1727096035.18040: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096035.18104: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096035.18224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096035.19943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096035.19989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096035.20018: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096035.20044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096035.20064: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096035.20130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096035.20150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096035.20167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096035.20194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096035.20205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096035.20245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096035.20260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096035.20279: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096035.20303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096035.20313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096035.20346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096035.20362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096035.20381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096035.20404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096035.20414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096035.20514: variable 'network_connections' from source: task vars 7530 1727096035.20529: variable 'interface' from source: play vars 7530 1727096035.20582: variable 'interface' from source: play vars 7530 1727096035.20635: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096035.20765: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096035.20797: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096035.20818: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096035.20841: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096035.20872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096035.20892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096035.20909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096035.20928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096035.20965: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096035.21124: variable 'network_connections' from source: task vars 7530 1727096035.21131: variable 'interface' from source: play vars 7530 1727096035.21175: variable 'interface' from source: play vars 7530 1727096035.21197: Evaluated conditional (__network_wpa_supplicant_required): False 7530 1727096035.21201: when evaluation is False, skipping this task 7530 1727096035.21205: _execute() done 7530 1727096035.21208: 
dumping result to json 7530 1727096035.21210: done dumping result, returning 7530 1727096035.21220: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-086b-f4f0-000000000076] 7530 1727096035.21236: sending task result for task 0afff68d-5257-086b-f4f0-000000000076 7530 1727096035.21311: done sending task result for task 0afff68d-5257-086b-f4f0-000000000076 7530 1727096035.21314: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7530 1727096035.21363: no more pending results, returning what we have 7530 1727096035.21374: results queue empty 7530 1727096035.21376: checking for any_errors_fatal 7530 1727096035.21393: done checking for any_errors_fatal 7530 1727096035.21394: checking for max_fail_percentage 7530 1727096035.21395: done checking for max_fail_percentage 7530 1727096035.21396: checking to see if all hosts have failed and the running result is not ok 7530 1727096035.21397: done checking to see if all hosts have failed 7530 1727096035.21398: getting the remaining hosts for this loop 7530 1727096035.21400: done getting the remaining hosts for this loop 7530 1727096035.21404: getting the next task for host managed_node3 7530 1727096035.21411: done getting next task for host managed_node3 7530 1727096035.21415: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7530 1727096035.21418: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096035.21437: getting variables 7530 1727096035.21439: in VariableManager get_vars() 7530 1727096035.21497: Calling all_inventory to load vars for managed_node3 7530 1727096035.21500: Calling groups_inventory to load vars for managed_node3 7530 1727096035.21502: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096035.21512: Calling all_plugins_play to load vars for managed_node3 7530 1727096035.21514: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096035.21517: Calling groups_plugins_play to load vars for managed_node3 7530 1727096035.22424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096035.23299: done with get_vars() 7530 1727096035.23322: done getting variables 7530 1727096035.23372: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:53:55 -0400 (0:00:00.065) 0:00:26.022 ****** 7530 1727096035.23397: entering _queue_task() for managed_node3/service 7530 1727096035.23660: worker is 1 (out of 1 available) 7530 1727096035.23674: exiting _queue_task() for managed_node3/service 7530 1727096035.23687: done queuing things up, now waiting for results queue to drain 7530 1727096035.23688: waiting for pending results... 
7530 1727096035.23876: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7530 1727096035.23970: in run() - task 0afff68d-5257-086b-f4f0-000000000077 7530 1727096035.23984: variable 'ansible_search_path' from source: unknown 7530 1727096035.23988: variable 'ansible_search_path' from source: unknown 7530 1727096035.24021: calling self._execute() 7530 1727096035.24102: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096035.24108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096035.24118: variable 'omit' from source: magic vars 7530 1727096035.24400: variable 'ansible_distribution_major_version' from source: facts 7530 1727096035.24410: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096035.24497: variable 'network_provider' from source: set_fact 7530 1727096035.24502: Evaluated conditional (network_provider == "initscripts"): False 7530 1727096035.24504: when evaluation is False, skipping this task 7530 1727096035.24508: _execute() done 7530 1727096035.24510: dumping result to json 7530 1727096035.24512: done dumping result, returning 7530 1727096035.24521: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-086b-f4f0-000000000077] 7530 1727096035.24528: sending task result for task 0afff68d-5257-086b-f4f0-000000000077 7530 1727096035.24620: done sending task result for task 0afff68d-5257-086b-f4f0-000000000077 7530 1727096035.24623: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096035.24674: no more pending results, returning what we have 7530 1727096035.24678: results queue empty 7530 1727096035.24679: checking for any_errors_fatal 7530 1727096035.24686: done checking for any_errors_fatal 7530 
1727096035.24687: checking for max_fail_percentage 7530 1727096035.24689: done checking for max_fail_percentage 7530 1727096035.24690: checking to see if all hosts have failed and the running result is not ok 7530 1727096035.24691: done checking to see if all hosts have failed 7530 1727096035.24692: getting the remaining hosts for this loop 7530 1727096035.24693: done getting the remaining hosts for this loop 7530 1727096035.24696: getting the next task for host managed_node3 7530 1727096035.24703: done getting next task for host managed_node3 7530 1727096035.24707: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7530 1727096035.24710: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096035.24731: getting variables 7530 1727096035.24733: in VariableManager get_vars() 7530 1727096035.24781: Calling all_inventory to load vars for managed_node3 7530 1727096035.24783: Calling groups_inventory to load vars for managed_node3 7530 1727096035.24786: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096035.24794: Calling all_plugins_play to load vars for managed_node3 7530 1727096035.24797: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096035.24799: Calling groups_plugins_play to load vars for managed_node3 7530 1727096035.25583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096035.26589: done with get_vars() 7530 1727096035.26611: done getting variables 7530 1727096035.26660: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:53:55 -0400 (0:00:00.032) 0:00:26.055 ****** 7530 1727096035.26688: entering _queue_task() for managed_node3/copy 7530 1727096035.26952: worker is 1 (out of 1 available) 7530 1727096035.26966: exiting _queue_task() for managed_node3/copy 7530 1727096035.26979: done queuing things up, now waiting for results queue to drain 7530 1727096035.26981: waiting for pending results... 
7530 1727096035.27165: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
7530 1727096035.27262: in run() - task 0afff68d-5257-086b-f4f0-000000000078
7530 1727096035.27276: variable 'ansible_search_path' from source: unknown
7530 1727096035.27280: variable 'ansible_search_path' from source: unknown
7530 1727096035.27312: calling self._execute()
7530 1727096035.27395: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096035.27399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096035.27408: variable 'omit' from source: magic vars
7530 1727096035.27704: variable 'ansible_distribution_major_version' from source: facts
7530 1727096035.27714: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096035.27798: variable 'network_provider' from source: set_fact
7530 1727096035.27802: Evaluated conditional (network_provider == "initscripts"): False
7530 1727096035.27805: when evaluation is False, skipping this task
7530 1727096035.27808: _execute() done
7530 1727096035.27811: dumping result to json
7530 1727096035.27813: done dumping result, returning
7530 1727096035.27823: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-086b-f4f0-000000000078]
7530 1727096035.27825: sending task result for task 0afff68d-5257-086b-f4f0-000000000078
7530 1727096035.27918: done sending task result for task 0afff68d-5257-086b-f4f0-000000000078
7530 1727096035.27921: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
7530 1727096035.27969: no more pending results, returning what we have
7530 1727096035.27972: results queue empty
7530 1727096035.27973: checking for any_errors_fatal
7530 1727096035.27980: done checking for any_errors_fatal
7530 1727096035.27980: checking for max_fail_percentage
7530 1727096035.27982: done checking for max_fail_percentage
7530 1727096035.27983: checking to see if all hosts have failed and the running result is not ok
7530 1727096035.27984: done checking to see if all hosts have failed
7530 1727096035.27985: getting the remaining hosts for this loop
7530 1727096035.27986: done getting the remaining hosts for this loop
7530 1727096035.27990: getting the next task for host managed_node3
7530 1727096035.27996: done getting next task for host managed_node3
7530 1727096035.28000: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
7530 1727096035.28003: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096035.28022: getting variables
7530 1727096035.28024: in VariableManager get_vars()
7530 1727096035.28071: Calling all_inventory to load vars for managed_node3
7530 1727096035.28075: Calling groups_inventory to load vars for managed_node3
7530 1727096035.28077: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096035.28086: Calling all_plugins_play to load vars for managed_node3
7530 1727096035.28088: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096035.28091: Calling groups_plugins_play to load vars for managed_node3
7530 1727096035.28858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096035.29716: done with get_vars()
7530 1727096035.29740: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Monday 23 September 2024 08:53:55 -0400 (0:00:00.031) 0:00:26.086 ******
7530 1727096035.29810: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
7530 1727096035.30063: worker is 1 (out of 1 available)
7530 1727096035.30078: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
7530 1727096035.30092: done queuing things up, now waiting for results queue to drain
7530 1727096035.30093: waiting for pending results...
7530 1727096035.30280: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
7530 1727096035.30370: in run() - task 0afff68d-5257-086b-f4f0-000000000079
7530 1727096035.30383: variable 'ansible_search_path' from source: unknown
7530 1727096035.30386: variable 'ansible_search_path' from source: unknown
7530 1727096035.30415: calling self._execute()
7530 1727096035.30498: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096035.30502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096035.30512: variable 'omit' from source: magic vars
7530 1727096035.30798: variable 'ansible_distribution_major_version' from source: facts
7530 1727096035.30808: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096035.30814: variable 'omit' from source: magic vars
7530 1727096035.30856: variable 'omit' from source: magic vars
7530 1727096035.31075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096035.32466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096035.32514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096035.32545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096035.32572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096035.32592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096035.32658: variable 'network_provider' from source: set_fact
7530 1727096035.32762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096035.32797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096035.32818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096035.32847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096035.32858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096035.32912: variable 'omit' from source: magic vars
7530 1727096035.32999: variable 'omit' from source: magic vars
7530 1727096035.33077: variable 'network_connections' from source: task vars
7530 1727096035.33087: variable 'interface' from source: play vars
7530 1727096035.33133: variable 'interface' from source: play vars
7530 1727096035.33238: variable 'omit' from source: magic vars
7530 1727096035.33246: variable '__lsr_ansible_managed' from source: task vars
7530 1727096035.33292: variable '__lsr_ansible_managed' from source: task vars
7530 1727096035.33754: Loaded config def from plugin (lookup/template)
7530 1727096035.33758: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
7530 1727096035.33782: File lookup term: get_ansible_managed.j2
7530 1727096035.33785: variable 'ansible_search_path' from source: unknown
7530 1727096035.33790: evaluation_path:
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
7530 1727096035.33807: search_path:
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
7530 1727096035.33825: variable 'ansible_search_path' from source: unknown
7530 1727096035.37066: variable 'ansible_managed' from source: unknown
7530 1727096035.37162: variable 'omit' from source: magic vars
7530 1727096035.37193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096035.37215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096035.37232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096035.37244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096035.37252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096035.37282: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096035.37285: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096035.37287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096035.37351: Set connection var ansible_pipelining to False
7530 1727096035.37356: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096035.37361: Set connection var ansible_timeout to 10
7530 1727096035.37370: Set connection var ansible_shell_executable to /bin/sh
7530 1727096035.37373: Set connection var ansible_shell_type to sh
7530 1727096035.37377: Set connection var ansible_connection to ssh
7530 1727096035.37397: variable 'ansible_shell_executable' from source: unknown
7530 1727096035.37400: variable 'ansible_connection' from source: unknown
7530 1727096035.37403: variable 'ansible_module_compression' from source: unknown
7530 1727096035.37405: variable 'ansible_shell_type' from source: unknown
7530 1727096035.37407: variable 'ansible_shell_executable' from source: unknown
7530 1727096035.37409: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096035.37414: variable 'ansible_pipelining' from source: unknown
7530 1727096035.37416: variable 'ansible_timeout' from source: unknown
7530 1727096035.37420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096035.37518: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
7530 1727096035.37534: variable 'omit' from source: magic vars
7530 1727096035.37537: starting attempt loop
7530 1727096035.37539: running the handler
7530 1727096035.37548: _low_level_execute_command(): starting
7530 1727096035.37555: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096035.38076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096035.38081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
7530 1727096035.38084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<<
7530 1727096035.38087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.38131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096035.38134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096035.38136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096035.38192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096035.39889: stdout chunk (state=3): >>>/root <<<
7530 1727096035.39973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096035.40006: stderr chunk (state=3): >>><<<
7530 1727096035.40009: stdout chunk (state=3): >>><<<
7530 1727096035.40032: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096035.40044: _low_level_execute_command(): starting
7530 1727096035.40050: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876 `" && echo ansible-tmp-1727096035.4003298-8514-12452367621876="` echo /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876 `" ) && sleep 0'
7530 1727096035.40514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096035.40518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<<
7530 1727096035.40520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.40522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096035.40524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.40573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096035.40592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096035.40594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096035.40629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096035.42604: stdout chunk (state=3): >>>ansible-tmp-1727096035.4003298-8514-12452367621876=/root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876 <<<
7530 1727096035.42705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096035.42736: stderr chunk (state=3): >>><<<
7530 1727096035.42739: stdout chunk (state=3): >>><<<
7530 1727096035.42756: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096035.4003298-8514-12452367621876=/root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096035.42801: variable 'ansible_module_compression' from source: unknown
7530 1727096035.42842: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
7530 1727096035.42884: variable 'ansible_facts' from source: unknown
7530 1727096035.42978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py
7530 1727096035.43088: Sending initial data
7530 1727096035.43092: Sent initial data (165 bytes)
7530 1727096035.43556: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096035.43559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.43565: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096035.43570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.43617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096035.43620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096035.43623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096035.43665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096035.45312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
7530 1727096035.45332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
7530 1727096035.45363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpm3hywjzk /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py <<<
7530 1727096035.45371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py" <<<
7530 1727096035.45399: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpm3hywjzk" to remote "/root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py" <<<
7530 1727096035.45404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py" <<<
7530 1727096035.46101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096035.46143: stderr chunk (state=3): >>><<<
7530 1727096035.46147: stdout chunk (state=3): >>><<<
7530 1727096035.46177: done transferring module to remote
7530 1727096035.46184: _low_level_execute_command(): starting
7530 1727096035.46189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/ /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py && sleep 0'
7530 1727096035.46646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7530 1727096035.46649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.46652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<<
7530 1727096035.46655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.46706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096035.46710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096035.46714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096035.46752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096035.48603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096035.48627: stderr chunk (state=3): >>><<<
7530 1727096035.48631: stdout chunk (state=3): >>><<<
7530 1727096035.48652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096035.48655: _low_level_execute_command(): starting
7530 1727096035.48659: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/AnsiballZ_network_connections.py && sleep 0'
7530 1727096035.49121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096035.49125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.49127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
7530 1727096035.49130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7530 1727096035.49132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.49187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096035.49203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096035.49206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096035.49238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096035.82784: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n38zowj9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n38zowj9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/863f1165-7589-4aec-bdb1-d3d32b99b3c3: error=unknown <<<
7530 1727096035.82942: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
7530 1727096035.85123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<<
7530 1727096035.85128: stdout chunk (state=3): >>><<<
7530 1727096035.85130: stderr chunk (state=3): >>><<<
7530 1727096035.85173: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n38zowj9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n38zowj9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/863f1165-7589-4aec-bdb1-d3d32b99b3c3: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed.
7530 1727096035.85273: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
7530 1727096035.85276: _low_level_execute_command(): starting
7530 1727096035.85279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096035.4003298-8514-12452367621876/ > /dev/null 2>&1 && sleep 0'
7530 1727096035.85890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7530 1727096035.85958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096035.86024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
7530 1727096035.86075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096035.86106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096035.86137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096035.88083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096035.88088: stdout chunk (state=3): >>><<<
7530 1727096035.88090: stderr chunk (state=3): >>><<<
7530 1727096035.88109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7530 1727096035.88173: handler run complete
7530 1727096035.88177: attempt loop complete, returning result
7530 1727096035.88181: _execute() done
7530 1727096035.88184: dumping result to json
7530 1727096035.88186: done dumping result, returning
7530 1727096035.88204: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-086b-f4f0-000000000079]
7530 1727096035.88214: sending task result for task 0afff68d-5257-086b-f4f0-000000000079
7530 1727096035.88527: done sending task result for task 0afff68d-5257-086b-f4f0-000000000079
7530 1727096035.88531: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "veth0",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

7530 1727096035.88852: no more pending results, returning what we have
7530 1727096035.88856: results queue empty
7530 1727096035.88857: checking for any_errors_fatal
7530 1727096035.88864: done checking for any_errors_fatal
7530 1727096035.88865: checking for max_fail_percentage
7530 1727096035.88866: done checking for max_fail_percentage
7530 1727096035.88869: checking to see if all hosts have failed and the running result is not ok
7530 1727096035.88870: done checking to see if all hosts have failed
7530 1727096035.88871: getting the remaining hosts for this loop
7530 1727096035.88873: done getting the remaining hosts for this loop
7530 1727096035.88877: getting the next task for host managed_node3
7530 1727096035.88883: done getting next task for host managed_node3
7530 1727096035.88887: ^ task is: TASK:
fedora.linux_system_roles.network : Configure networking state 7530 1727096035.88890: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096035.88902: getting variables 7530 1727096035.88904: in VariableManager get_vars() 7530 1727096035.88950: Calling all_inventory to load vars for managed_node3 7530 1727096035.88960: Calling groups_inventory to load vars for managed_node3 7530 1727096035.88963: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096035.88975: Calling all_plugins_play to load vars for managed_node3 7530 1727096035.88977: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096035.88981: Calling groups_plugins_play to load vars for managed_node3 7530 1727096035.90548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096035.92214: done with get_vars() 7530 1727096035.92246: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:53:55 -0400 (0:00:00.625) 0:00:26.711 ****** 7530 1727096035.92335: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096035.92693: worker is 1 (out of 1 available) 7530 1727096035.92784: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 
1727096035.92796: done queuing things up, now waiting for results queue to drain 7530 1727096035.92798: waiting for pending results... 7530 1727096035.93049: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096035.93399: in run() - task 0afff68d-5257-086b-f4f0-00000000007a 7530 1727096035.93405: variable 'ansible_search_path' from source: unknown 7530 1727096035.93408: variable 'ansible_search_path' from source: unknown 7530 1727096035.93509: calling self._execute() 7530 1727096035.93697: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096035.93734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096035.93978: variable 'omit' from source: magic vars 7530 1727096035.94610: variable 'ansible_distribution_major_version' from source: facts 7530 1727096035.94738: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096035.94913: variable 'network_state' from source: role '' defaults 7530 1727096035.94982: Evaluated conditional (network_state != {}): False 7530 1727096035.95062: when evaluation is False, skipping this task 7530 1727096035.95072: _execute() done 7530 1727096035.95075: dumping result to json 7530 1727096035.95078: done dumping result, returning 7530 1727096035.95173: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-086b-f4f0-00000000007a] 7530 1727096035.95178: sending task result for task 0afff68d-5257-086b-f4f0-00000000007a 7530 1727096035.95254: done sending task result for task 0afff68d-5257-086b-f4f0-00000000007a 7530 1727096035.95257: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096035.95319: no more pending results, returning what we have 7530 1727096035.95323: results queue empty 7530 
1727096035.95325: checking for any_errors_fatal 7530 1727096035.95334: done checking for any_errors_fatal 7530 1727096035.95335: checking for max_fail_percentage 7530 1727096035.95337: done checking for max_fail_percentage 7530 1727096035.95339: checking to see if all hosts have failed and the running result is not ok 7530 1727096035.95340: done checking to see if all hosts have failed 7530 1727096035.95341: getting the remaining hosts for this loop 7530 1727096035.95342: done getting the remaining hosts for this loop 7530 1727096035.95346: getting the next task for host managed_node3 7530 1727096035.95353: done getting next task for host managed_node3 7530 1727096035.95358: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096035.95362: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096035.95385: getting variables 7530 1727096035.95388: in VariableManager get_vars() 7530 1727096035.95444: Calling all_inventory to load vars for managed_node3 7530 1727096035.95447: Calling groups_inventory to load vars for managed_node3 7530 1727096035.95449: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096035.95461: Calling all_plugins_play to load vars for managed_node3 7530 1727096035.95464: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096035.95786: Calling groups_plugins_play to load vars for managed_node3 7530 1727096035.98629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.02025: done with get_vars() 7530 1727096036.02058: done getting variables 7530 1727096036.02243: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:53:56 -0400 (0:00:00.100) 0:00:26.812 ****** 7530 1727096036.02382: entering _queue_task() for managed_node3/debug 7530 1727096036.02895: worker is 1 (out of 1 available) 7530 1727096036.02907: exiting _queue_task() for managed_node3/debug 7530 1727096036.02919: done queuing things up, now waiting for results queue to drain 7530 1727096036.02920: waiting for pending results... 
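The `changed: true` result for the "Configure networking connection profiles" task above shows the exact `module_args` the role handed to `fedora.linux_system_roles.network_connections`. As a sketch of the role input that produces that invocation (the play name and host pattern are assumptions; the connection spec is copied verbatim from the logged `module_args`):

```yaml
# Hypothetical play reconstructing the role variables implied by the
# logged module_args; only the veth0 connection spec comes from the log.
- name: Remove the veth0 profile via the network role
  hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: veth0
            persistent_state: absent   # delete the persistent profile
            state: down                # deactivate it as well
```

With `persistent_state: absent` the `nm` provider removes the NetworkManager profile, which is consistent with the task reporting `changed: true`.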
7530 1727096036.03385: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096036.03390: in run() - task 0afff68d-5257-086b-f4f0-00000000007b 7530 1727096036.03517: variable 'ansible_search_path' from source: unknown 7530 1727096036.03520: variable 'ansible_search_path' from source: unknown 7530 1727096036.03523: calling self._execute() 7530 1727096036.03572: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.03586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.03600: variable 'omit' from source: magic vars 7530 1727096036.03998: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.04015: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.04025: variable 'omit' from source: magic vars 7530 1727096036.04090: variable 'omit' from source: magic vars 7530 1727096036.04129: variable 'omit' from source: magic vars 7530 1727096036.04185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096036.04225: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096036.04249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096036.04284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.04372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.04375: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096036.04389: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.04391: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7530 1727096036.04462: Set connection var ansible_pipelining to False 7530 1727096036.04476: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096036.04493: Set connection var ansible_timeout to 10 7530 1727096036.04511: Set connection var ansible_shell_executable to /bin/sh 7530 1727096036.04518: Set connection var ansible_shell_type to sh 7530 1727096036.04523: Set connection var ansible_connection to ssh 7530 1727096036.04551: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.04558: variable 'ansible_connection' from source: unknown 7530 1727096036.04564: variable 'ansible_module_compression' from source: unknown 7530 1727096036.04572: variable 'ansible_shell_type' from source: unknown 7530 1727096036.04579: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.04584: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.04591: variable 'ansible_pipelining' from source: unknown 7530 1727096036.04609: variable 'ansible_timeout' from source: unknown 7530 1727096036.04673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.04773: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096036.04792: variable 'omit' from source: magic vars 7530 1727096036.04803: starting attempt loop 7530 1727096036.04817: running the handler 7530 1727096036.04973: variable '__network_connections_result' from source: set_fact 7530 1727096036.05039: handler run complete 7530 1727096036.05071: attempt loop complete, returning result 7530 1727096036.05080: _execute() done 7530 1727096036.05150: dumping result to json 7530 1727096036.05157: done dumping result, returning 7530 
1727096036.05160: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-086b-f4f0-00000000007b] 7530 1727096036.05163: sending task result for task 0afff68d-5257-086b-f4f0-00000000007b 7530 1727096036.05236: done sending task result for task 0afff68d-5257-086b-f4f0-00000000007b 7530 1727096036.05240: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7530 1727096036.05336: no more pending results, returning what we have 7530 1727096036.05340: results queue empty 7530 1727096036.05342: checking for any_errors_fatal 7530 1727096036.05349: done checking for any_errors_fatal 7530 1727096036.05350: checking for max_fail_percentage 7530 1727096036.05352: done checking for max_fail_percentage 7530 1727096036.05353: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.05354: done checking to see if all hosts have failed 7530 1727096036.05355: getting the remaining hosts for this loop 7530 1727096036.05364: done getting the remaining hosts for this loop 7530 1727096036.05371: getting the next task for host managed_node3 7530 1727096036.05378: done getting next task for host managed_node3 7530 1727096036.05382: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096036.05385: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7530 1727096036.05398: getting variables 7530 1727096036.05400: in VariableManager get_vars() 7530 1727096036.05452: Calling all_inventory to load vars for managed_node3 7530 1727096036.05455: Calling groups_inventory to load vars for managed_node3 7530 1727096036.05457: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.05577: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.05582: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.05586: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.06788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.07642: done with get_vars() 7530 1727096036.07680: done getting variables 7530 1727096036.07749: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:53:56 -0400 (0:00:00.054) 0:00:26.866 ****** 7530 1727096036.07789: entering _queue_task() for managed_node3/debug 7530 1727096036.08155: worker is 1 (out of 1 available) 7530 1727096036.08373: exiting _queue_task() for managed_node3/debug 7530 1727096036.08384: done queuing things up, now waiting for results queue to drain 7530 1727096036.08386: waiting for pending results... 
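The `ok: [managed_node3]` output above, printing `__network_connections_result.stderr_lines` as `[""]`, is the shape of output a `var`-style debug task produces. A minimal equivalent (the task name and variable name are taken from the log; the rest is a sketch):

```yaml
# Sketch of a debug task that prints a registered result's stderr lines,
# matching the "ok: [managed_node3]" output above.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```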
7530 1727096036.08591: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096036.08662: in run() - task 0afff68d-5257-086b-f4f0-00000000007c 7530 1727096036.08683: variable 'ansible_search_path' from source: unknown 7530 1727096036.08686: variable 'ansible_search_path' from source: unknown 7530 1727096036.08725: calling self._execute() 7530 1727096036.08880: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.08884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.08887: variable 'omit' from source: magic vars 7530 1727096036.09256: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.09375: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.09378: variable 'omit' from source: magic vars 7530 1727096036.09380: variable 'omit' from source: magic vars 7530 1727096036.09382: variable 'omit' from source: magic vars 7530 1727096036.09432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096036.09472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096036.09501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096036.09525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.09541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.09576: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096036.09585: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.09592: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096036.09700: Set connection var ansible_pipelining to False 7530 1727096036.09717: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096036.09729: Set connection var ansible_timeout to 10 7530 1727096036.09744: Set connection var ansible_shell_executable to /bin/sh 7530 1727096036.09822: Set connection var ansible_shell_type to sh 7530 1727096036.09825: Set connection var ansible_connection to ssh 7530 1727096036.09827: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.09829: variable 'ansible_connection' from source: unknown 7530 1727096036.09832: variable 'ansible_module_compression' from source: unknown 7530 1727096036.09833: variable 'ansible_shell_type' from source: unknown 7530 1727096036.09835: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.09837: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.09839: variable 'ansible_pipelining' from source: unknown 7530 1727096036.09841: variable 'ansible_timeout' from source: unknown 7530 1727096036.09843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.09986: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096036.10021: variable 'omit' from source: magic vars 7530 1727096036.10024: starting attempt loop 7530 1727096036.10026: running the handler 7530 1727096036.10078: variable '__network_connections_result' from source: set_fact 7530 1727096036.10143: variable '__network_connections_result' from source: set_fact 7530 1727096036.10233: handler run complete 7530 1727096036.10252: attempt loop complete, returning result 7530 1727096036.10256: _execute() done 7530 1727096036.10259: dumping 
result to json 7530 1727096036.10266: done dumping result, returning 7530 1727096036.10274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-086b-f4f0-00000000007c] 7530 1727096036.10277: sending task result for task 0afff68d-5257-086b-f4f0-00000000007c 7530 1727096036.10366: done sending task result for task 0afff68d-5257-086b-f4f0-00000000007c 7530 1727096036.10371: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7530 1727096036.10459: no more pending results, returning what we have 7530 1727096036.10463: results queue empty 7530 1727096036.10464: checking for any_errors_fatal 7530 1727096036.10472: done checking for any_errors_fatal 7530 1727096036.10473: checking for max_fail_percentage 7530 1727096036.10474: done checking for max_fail_percentage 7530 1727096036.10475: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.10476: done checking to see if all hosts have failed 7530 1727096036.10477: getting the remaining hosts for this loop 7530 1727096036.10478: done getting the remaining hosts for this loop 7530 1727096036.10488: getting the next task for host managed_node3 7530 1727096036.10494: done getting next task for host managed_node3 7530 1727096036.10497: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096036.10500: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096036.10510: getting variables 7530 1727096036.10512: in VariableManager get_vars() 7530 1727096036.10556: Calling all_inventory to load vars for managed_node3 7530 1727096036.10559: Calling groups_inventory to load vars for managed_node3 7530 1727096036.10560: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.10571: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.10574: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.10576: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.11365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.12492: done with get_vars() 7530 1727096036.12524: done getting variables 7530 1727096036.12590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:53:56 -0400 (0:00:00.048) 0:00:26.914 ****** 7530 1727096036.12635: entering _queue_task() for managed_node3/debug 7530 1727096036.13003: worker is 1 (out of 1 available) 7530 1727096036.13017: exiting _queue_task() 
for managed_node3/debug 7530 1727096036.13036: done queuing things up, now waiting for results queue to drain 7530 1727096036.13038: waiting for pending results... 7530 1727096036.13254: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096036.13357: in run() - task 0afff68d-5257-086b-f4f0-00000000007d 7530 1727096036.13370: variable 'ansible_search_path' from source: unknown 7530 1727096036.13374: variable 'ansible_search_path' from source: unknown 7530 1727096036.13405: calling self._execute() 7530 1727096036.13485: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.13489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.13499: variable 'omit' from source: magic vars 7530 1727096036.13792: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.13801: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.13891: variable 'network_state' from source: role '' defaults 7530 1727096036.13901: Evaluated conditional (network_state != {}): False 7530 1727096036.13904: when evaluation is False, skipping this task 7530 1727096036.13908: _execute() done 7530 1727096036.13910: dumping result to json 7530 1727096036.13912: done dumping result, returning 7530 1727096036.13921: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-086b-f4f0-00000000007d] 7530 1727096036.13931: sending task result for task 0afff68d-5257-086b-f4f0-00000000007d 7530 1727096036.14019: done sending task result for task 0afff68d-5257-086b-f4f0-00000000007d 7530 1727096036.14021: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7530 1727096036.14079: no more pending results, returning what we have 7530 1727096036.14083: results queue empty 7530 
1727096036.14084: checking for any_errors_fatal 7530 1727096036.14091: done checking for any_errors_fatal 7530 1727096036.14092: checking for max_fail_percentage 7530 1727096036.14093: done checking for max_fail_percentage 7530 1727096036.14094: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.14095: done checking to see if all hosts have failed 7530 1727096036.14095: getting the remaining hosts for this loop 7530 1727096036.14097: done getting the remaining hosts for this loop 7530 1727096036.14100: getting the next task for host managed_node3 7530 1727096036.14107: done getting next task for host managed_node3 7530 1727096036.14111: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096036.14114: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.14134: getting variables 7530 1727096036.14136: in VariableManager get_vars() 7530 1727096036.14184: Calling all_inventory to load vars for managed_node3 7530 1727096036.14187: Calling groups_inventory to load vars for managed_node3 7530 1727096036.14189: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.14199: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.14201: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.14204: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.15073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.15945: done with get_vars() 7530 1727096036.15972: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:53:56 -0400 (0:00:00.034) 0:00:26.948 ****** 7530 1727096036.16048: entering _queue_task() for managed_node3/ping 7530 1727096036.16311: worker is 1 (out of 1 available) 7530 1727096036.16326: exiting _queue_task() for managed_node3/ping 7530 1727096036.16337: done queuing things up, now waiting for results queue to drain 7530 1727096036.16339: waiting for pending results... 
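Both "Configure networking state" and "Show debug messages for the network_state" were skipped above with `false_condition: network_state != {}`: the role's `network_state` variable defaults to an empty dict, so the state-based tasks only run when the caller sets it. A hedged sketch of such a guard (the task body is a placeholder; only the `when` expression comes from the logged skip result):

```yaml
# Illustration of the conditional seen in the skip results above.
- name: Configure networking state
  ansible.builtin.debug:
    msg: "would apply state-based configuration here"
  when: network_state != {}   # skipped while network_state stays {}
```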
7530 1727096036.16533: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096036.16638: in run() - task 0afff68d-5257-086b-f4f0-00000000007e 7530 1727096036.16651: variable 'ansible_search_path' from source: unknown 7530 1727096036.16655: variable 'ansible_search_path' from source: unknown 7530 1727096036.16688: calling self._execute() 7530 1727096036.16771: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.16774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.16785: variable 'omit' from source: magic vars 7530 1727096036.17082: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.17092: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.17098: variable 'omit' from source: magic vars 7530 1727096036.17142: variable 'omit' from source: magic vars 7530 1727096036.17170: variable 'omit' from source: magic vars 7530 1727096036.17206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096036.17238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096036.17254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096036.17270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.17280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.17303: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096036.17307: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.17309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096036.17391: Set connection var ansible_pipelining to False 7530 1727096036.17397: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096036.17403: Set connection var ansible_timeout to 10 7530 1727096036.17411: Set connection var ansible_shell_executable to /bin/sh 7530 1727096036.17414: Set connection var ansible_shell_type to sh 7530 1727096036.17416: Set connection var ansible_connection to ssh 7530 1727096036.17440: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.17444: variable 'ansible_connection' from source: unknown 7530 1727096036.17447: variable 'ansible_module_compression' from source: unknown 7530 1727096036.17449: variable 'ansible_shell_type' from source: unknown 7530 1727096036.17451: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.17453: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.17456: variable 'ansible_pipelining' from source: unknown 7530 1727096036.17458: variable 'ansible_timeout' from source: unknown 7530 1727096036.17463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.17616: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096036.17625: variable 'omit' from source: magic vars 7530 1727096036.17631: starting attempt loop 7530 1727096036.17634: running the handler 7530 1727096036.17645: _low_level_execute_command(): starting 7530 1727096036.17653: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096036.18185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.18190: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.18195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.18244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.18249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.18295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.20010: stdout chunk (state=3): >>>/root <<< 7530 1727096036.20103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.20136: stderr chunk (state=3): >>><<< 7530 1727096036.20140: stdout chunk (state=3): >>><<< 7530 1727096036.20161: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.20175: _low_level_execute_command(): starting 7530 1727096036.20183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257 `" && echo ansible-tmp-1727096036.2016141-8543-199676756298257="` echo /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257 `" ) && sleep 0' 7530 1727096036.20650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.20654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.20657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096036.20670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.20713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.20716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.20719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.20761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.22762: stdout chunk (state=3): >>>ansible-tmp-1727096036.2016141-8543-199676756298257=/root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257 <<< 7530 1727096036.22859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.22892: stderr chunk (state=3): >>><<< 7530 1727096036.22895: stdout chunk (state=3): >>><<< 7530 1727096036.22911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096036.2016141-8543-199676756298257=/root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.22959: variable 'ansible_module_compression' from source: unknown 7530 1727096036.22993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7530 1727096036.23025: variable 'ansible_facts' from source: unknown 7530 1727096036.23084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py 7530 1727096036.23191: Sending initial data 7530 1727096036.23195: Sent initial data (151 bytes) 7530 1727096036.23662: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.23665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.23672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.23675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.23725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.23731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.23737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.23773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.25409: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096036.25434: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096036.25466: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp6lnu6hyl /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py <<< 7530 1727096036.25478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py" <<< 7530 1727096036.25499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp6lnu6hyl" to remote "/root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py" <<< 7530 1727096036.25506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py" <<< 7530 1727096036.25990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.26034: stderr chunk (state=3): >>><<< 7530 1727096036.26038: stdout chunk (state=3): >>><<< 7530 1727096036.26083: done transferring module to remote 7530 1727096036.26093: _low_level_execute_command(): starting 7530 1727096036.26099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/ /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py && sleep 0' 7530 1727096036.26550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096036.26555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.26567: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.26627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.26637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.26640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.26677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.28526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.28554: stderr chunk (state=3): >>><<< 7530 1727096036.28558: stdout chunk (state=3): >>><<< 7530 1727096036.28575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.28577: _low_level_execute_command(): starting 7530 1727096036.28583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/AnsiballZ_ping.py && sleep 0' 7530 1727096036.29040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.29043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.29046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.29106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.29113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.29115: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.29153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.45111: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7530 1727096036.46590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096036.46607: stderr chunk (state=3): >>><<< 7530 1727096036.46615: stdout chunk (state=3): >>><<< 7530 1727096036.46657: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096036.46785: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096036.46789: _low_level_execute_command(): starting 7530 1727096036.46791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096036.2016141-8543-199676756298257/ > /dev/null 2>&1 && sleep 0' 7530 1727096036.47300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.47345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 
1727096036.47406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.47433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.47494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.49354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.49384: stderr chunk (state=3): >>><<< 7530 1727096036.49388: stdout chunk (state=3): >>><<< 7530 1727096036.49403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.49412: handler run complete 7530 1727096036.49425: attempt loop complete, returning result 7530 1727096036.49431: _execute() done 7530 1727096036.49433: 
dumping result to json 7530 1727096036.49436: done dumping result, returning 7530 1727096036.49441: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-086b-f4f0-00000000007e] 7530 1727096036.49446: sending task result for task 0afff68d-5257-086b-f4f0-00000000007e 7530 1727096036.49538: done sending task result for task 0afff68d-5257-086b-f4f0-00000000007e 7530 1727096036.49540: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7530 1727096036.49601: no more pending results, returning what we have 7530 1727096036.49604: results queue empty 7530 1727096036.49605: checking for any_errors_fatal 7530 1727096036.49612: done checking for any_errors_fatal 7530 1727096036.49613: checking for max_fail_percentage 7530 1727096036.49614: done checking for max_fail_percentage 7530 1727096036.49615: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.49616: done checking to see if all hosts have failed 7530 1727096036.49617: getting the remaining hosts for this loop 7530 1727096036.49618: done getting the remaining hosts for this loop 7530 1727096036.49621: getting the next task for host managed_node3 7530 1727096036.49633: done getting next task for host managed_node3 7530 1727096036.49635: ^ task is: TASK: meta (role_complete) 7530 1727096036.49638: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.49649: getting variables 7530 1727096036.49651: in VariableManager get_vars() 7530 1727096036.49703: Calling all_inventory to load vars for managed_node3 7530 1727096036.49706: Calling groups_inventory to load vars for managed_node3 7530 1727096036.49708: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.49719: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.49721: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.49724: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.50996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.56497: done with get_vars() 7530 1727096036.56520: done getting variables 7530 1727096036.56578: done queuing things up, now waiting for results queue to drain 7530 1727096036.56580: results queue empty 7530 1727096036.56580: checking for any_errors_fatal 7530 1727096036.56582: done checking for any_errors_fatal 7530 1727096036.56583: checking for max_fail_percentage 7530 1727096036.56583: done checking for max_fail_percentage 7530 1727096036.56584: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.56585: done checking to see if all hosts have failed 7530 1727096036.56585: getting the remaining hosts for this loop 7530 1727096036.56586: done getting the remaining hosts for this loop 7530 1727096036.56589: getting the next task for host managed_node3 7530 1727096036.56592: done getting next task for host managed_node3 7530 1727096036.56593: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7530 1727096036.56595: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7530 1727096036.56596: getting variables 7530 1727096036.56597: in VariableManager get_vars() 7530 1727096036.56611: Calling all_inventory to load vars for managed_node3 7530 1727096036.56612: Calling groups_inventory to load vars for managed_node3 7530 1727096036.56614: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.56617: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.56619: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.56620: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.57266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.58135: done with get_vars() 7530 1727096036.58156: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:79 Monday 23 September 2024 08:53:56 -0400 (0:00:00.421) 0:00:27.370 ****** 7530 1727096036.58213: entering _queue_task() for managed_node3/include_tasks 7530 1727096036.58483: worker is 1 (out of 1 available) 7530 1727096036.58497: exiting _queue_task() for managed_node3/include_tasks 7530 1727096036.58509: done queuing things up, now waiting for results queue to drain 7530 1727096036.58512: waiting for pending results... 
7530 1727096036.58710: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7530 1727096036.58782: in run() - task 0afff68d-5257-086b-f4f0-0000000000ae 7530 1727096036.58794: variable 'ansible_search_path' from source: unknown 7530 1727096036.58829: calling self._execute() 7530 1727096036.58910: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.58915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.58924: variable 'omit' from source: magic vars 7530 1727096036.59229: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.59238: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.59245: _execute() done 7530 1727096036.59248: dumping result to json 7530 1727096036.59251: done dumping result, returning 7530 1727096036.59259: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-086b-f4f0-0000000000ae] 7530 1727096036.59264: sending task result for task 0afff68d-5257-086b-f4f0-0000000000ae 7530 1727096036.59370: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000ae 7530 1727096036.59373: WORKER PROCESS EXITING 7530 1727096036.59412: no more pending results, returning what we have 7530 1727096036.59416: in VariableManager get_vars() 7530 1727096036.59477: Calling all_inventory to load vars for managed_node3 7530 1727096036.59480: Calling groups_inventory to load vars for managed_node3 7530 1727096036.59484: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.59497: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.59500: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.59503: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.60459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096036.61323: done with get_vars() 7530 1727096036.61343: variable 'ansible_search_path' from source: unknown 7530 1727096036.61355: we have included files to process 7530 1727096036.61356: generating all_blocks data 7530 1727096036.61358: done generating all_blocks data 7530 1727096036.61362: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096036.61363: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096036.61364: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096036.61636: in VariableManager get_vars() 7530 1727096036.61657: done with get_vars() 7530 1727096036.62096: done processing included file 7530 1727096036.62098: iterating over new_blocks loaded from include file 7530 1727096036.62099: in VariableManager get_vars() 7530 1727096036.62114: done with get_vars() 7530 1727096036.62115: filtering new block on tags 7530 1727096036.62137: done filtering new block on tags 7530 1727096036.62138: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7530 1727096036.62143: extending task lists for all hosts with included blocks 7530 1727096036.64824: done extending task lists 7530 1727096036.64828: done processing included files 7530 1727096036.64829: results queue empty 7530 1727096036.64829: checking for any_errors_fatal 7530 1727096036.64831: done checking for any_errors_fatal 7530 1727096036.64831: checking for max_fail_percentage 7530 1727096036.64832: done checking for max_fail_percentage 7530 1727096036.64833: checking to see if all hosts have failed and the running 
result is not ok 7530 1727096036.64833: done checking to see if all hosts have failed 7530 1727096036.64834: getting the remaining hosts for this loop 7530 1727096036.64835: done getting the remaining hosts for this loop 7530 1727096036.64836: getting the next task for host managed_node3 7530 1727096036.64839: done getting next task for host managed_node3 7530 1727096036.64841: ^ task is: TASK: Ensure state in ["present", "absent"] 7530 1727096036.64843: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.64845: getting variables 7530 1727096036.64845: in VariableManager get_vars() 7530 1727096036.64863: Calling all_inventory to load vars for managed_node3 7530 1727096036.64865: Calling groups_inventory to load vars for managed_node3 7530 1727096036.64866: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.64873: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.64875: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.64876: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.65547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.66537: done with get_vars() 7530 1727096036.66554: done getting variables 7530 1727096036.66591: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 08:53:56 -0400 (0:00:00.083) 0:00:27.454 ****** 7530 1727096036.66614: entering _queue_task() for managed_node3/fail 7530 1727096036.66890: worker is 1 (out of 1 available) 7530 1727096036.66903: exiting _queue_task() for managed_node3/fail 7530 1727096036.66915: done queuing things up, now waiting for results queue to drain 7530 1727096036.66917: waiting for pending results... 
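The two tasks that run next are input guards from manage_test_interface.yml: `fail` tasks whose `when` conditional fires only on invalid input, so with valid values both are skipped ("Conditional result was False"). A minimal Python sketch of the same guard logic — the function name and return shape are illustrative, not Ansible internals:

```python
# Sketch of the input guards behind the two "Ensure ..." fail tasks in the log.
# Each fail task runs (and fails the play) only when its conditional is True;
# with valid input both conditionals are False, so both tasks are skipped,
# matching the "false_condition" / "skip_reason" fields in the log output.

VALID_STATES = ["present", "absent"]
VALID_TYPES = ["dummy", "tap", "veth"]

def guard_conditionals(state: str, iface_type: str) -> dict:
    """Return the truth value of each fail-task conditional, keyed as the log prints it."""
    return {
        'state not in ["present", "absent"]': state not in VALID_STATES,
        'type not in ["dummy", "tap", "veth"]': iface_type not in VALID_TYPES,
    }

# For this run (state from include params, type from play vars) both guards
# evaluate False, so both fail tasks skip.
result = guard_conditionals("present", "veth")
```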
7530 1727096036.67097: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7530 1727096036.67170: in run() - task 0afff68d-5257-086b-f4f0-000000000dff 7530 1727096036.67183: variable 'ansible_search_path' from source: unknown 7530 1727096036.67187: variable 'ansible_search_path' from source: unknown 7530 1727096036.67215: calling self._execute() 7530 1727096036.67300: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.67304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.67312: variable 'omit' from source: magic vars 7530 1727096036.67597: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.67607: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.67701: variable 'state' from source: include params 7530 1727096036.67705: Evaluated conditional (state not in ["present", "absent"]): False 7530 1727096036.67708: when evaluation is False, skipping this task 7530 1727096036.67712: _execute() done 7530 1727096036.67715: dumping result to json 7530 1727096036.67717: done dumping result, returning 7530 1727096036.67724: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-086b-f4f0-000000000dff] 7530 1727096036.67732: sending task result for task 0afff68d-5257-086b-f4f0-000000000dff 7530 1727096036.67817: done sending task result for task 0afff68d-5257-086b-f4f0-000000000dff 7530 1727096036.67821: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7530 1727096036.67875: no more pending results, returning what we have 7530 1727096036.67879: results queue empty 7530 1727096036.67880: checking for any_errors_fatal 7530 1727096036.67882: done checking for any_errors_fatal 7530 1727096036.67883: checking for 
max_fail_percentage 7530 1727096036.67884: done checking for max_fail_percentage 7530 1727096036.67885: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.67886: done checking to see if all hosts have failed 7530 1727096036.67887: getting the remaining hosts for this loop 7530 1727096036.67888: done getting the remaining hosts for this loop 7530 1727096036.67892: getting the next task for host managed_node3 7530 1727096036.67898: done getting next task for host managed_node3 7530 1727096036.67900: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096036.67904: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.67908: getting variables 7530 1727096036.67910: in VariableManager get_vars() 7530 1727096036.67964: Calling all_inventory to load vars for managed_node3 7530 1727096036.67974: Calling groups_inventory to load vars for managed_node3 7530 1727096036.67977: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.67989: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.67992: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.67995: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.68777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.69653: done with get_vars() 7530 1727096036.69677: done getting variables 7530 1727096036.69725: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 08:53:56 -0400 (0:00:00.031) 0:00:27.485 ****** 7530 1727096036.69750: entering _queue_task() for managed_node3/fail 7530 1727096036.70013: worker is 1 (out of 1 available) 7530 1727096036.70030: exiting _queue_task() for managed_node3/fail 7530 1727096036.70042: done queuing things up, now waiting for results queue to drain 7530 1727096036.70044: waiting for pending results... 
7530 1727096036.70230: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096036.70300: in run() - task 0afff68d-5257-086b-f4f0-000000000e00 7530 1727096036.70311: variable 'ansible_search_path' from source: unknown 7530 1727096036.70315: variable 'ansible_search_path' from source: unknown 7530 1727096036.70345: calling self._execute() 7530 1727096036.70429: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.70433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.70441: variable 'omit' from source: magic vars 7530 1727096036.70739: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.70749: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.70852: variable 'type' from source: play vars 7530 1727096036.70856: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7530 1727096036.70859: when evaluation is False, skipping this task 7530 1727096036.70862: _execute() done 7530 1727096036.70865: dumping result to json 7530 1727096036.70870: done dumping result, returning 7530 1727096036.70877: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-086b-f4f0-000000000e00] 7530 1727096036.70882: sending task result for task 0afff68d-5257-086b-f4f0-000000000e00 7530 1727096036.70973: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e00 7530 1727096036.70976: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7530 1727096036.71028: no more pending results, returning what we have 7530 1727096036.71032: results queue empty 7530 1727096036.71033: checking for any_errors_fatal 7530 1727096036.71039: done checking for any_errors_fatal 7530 1727096036.71040: checking for 
max_fail_percentage 7530 1727096036.71041: done checking for max_fail_percentage 7530 1727096036.71042: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.71043: done checking to see if all hosts have failed 7530 1727096036.71044: getting the remaining hosts for this loop 7530 1727096036.71045: done getting the remaining hosts for this loop 7530 1727096036.71049: getting the next task for host managed_node3 7530 1727096036.71056: done getting next task for host managed_node3 7530 1727096036.71058: ^ task is: TASK: Include the task 'show_interfaces.yml' 7530 1727096036.71061: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.71066: getting variables 7530 1727096036.71069: in VariableManager get_vars() 7530 1727096036.71120: Calling all_inventory to load vars for managed_node3 7530 1727096036.71122: Calling groups_inventory to load vars for managed_node3 7530 1727096036.71125: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.71139: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.71142: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.71144: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.72076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.72945: done with get_vars() 7530 1727096036.72971: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 08:53:56 -0400 (0:00:00.032) 0:00:27.518 ****** 7530 1727096036.73049: entering _queue_task() for managed_node3/include_tasks 7530 1727096036.73310: worker is 1 (out of 1 available) 7530 1727096036.73324: exiting _queue_task() for managed_node3/include_tasks 7530 1727096036.73339: done queuing things up, now waiting for results queue to drain 7530 1727096036.73340: waiting for pending results... 
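The include handling that follows repeats a fixed pipeline the log narrates phase by phase: load the included file, generate blocks from it, filter the new blocks on tags, then extend the host's task list. A simplified conceptual sketch of that pipeline — this mirrors the log's phases only in outline and is not Ansible's actual implementation:

```python
# Simplified model of include-file processing as the log narrates it:
# "Loading data from ..." -> "generating all_blocks data" ->
# "filtering new block on tags" -> "extending task lists".
# Function and field names here are illustrative, not Ansible internals.

def process_included_file(path, tasks, run_tags=None):
    """Return the blocks from an included file that survive tag filtering."""
    # "generating all_blocks data": attach provenance to each loaded task
    blocks = [dict(t, source=path) for t in tasks]
    if run_tags:
        # "filtering new block on tags": keep untagged blocks and tag matches
        blocks = [b for b in blocks
                  if not b.get("tags") or set(b["tags"]) & set(run_tags)]
    # caller then appends these: "extending task lists for all hosts"
    return blocks

included = process_included_file(
    "tasks/show_interfaces.yml",
    [{"name": "Include the task 'get_current_interfaces.yml'"}],
)
```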
7530 1727096036.73529: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7530 1727096036.73605: in run() - task 0afff68d-5257-086b-f4f0-000000000e01 7530 1727096036.73616: variable 'ansible_search_path' from source: unknown 7530 1727096036.73620: variable 'ansible_search_path' from source: unknown 7530 1727096036.73650: calling self._execute() 7530 1727096036.73777: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.73781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.73786: variable 'omit' from source: magic vars 7530 1727096036.74029: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.74037: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.74043: _execute() done 7530 1727096036.74046: dumping result to json 7530 1727096036.74049: done dumping result, returning 7530 1727096036.74056: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-086b-f4f0-000000000e01] 7530 1727096036.74062: sending task result for task 0afff68d-5257-086b-f4f0-000000000e01 7530 1727096036.74153: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e01 7530 1727096036.74156: WORKER PROCESS EXITING 7530 1727096036.74190: no more pending results, returning what we have 7530 1727096036.74194: in VariableManager get_vars() 7530 1727096036.74255: Calling all_inventory to load vars for managed_node3 7530 1727096036.74258: Calling groups_inventory to load vars for managed_node3 7530 1727096036.74260: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.74276: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.74279: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.74282: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.75087: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.75963: done with get_vars() 7530 1727096036.75984: variable 'ansible_search_path' from source: unknown 7530 1727096036.75985: variable 'ansible_search_path' from source: unknown 7530 1727096036.76016: we have included files to process 7530 1727096036.76017: generating all_blocks data 7530 1727096036.76018: done generating all_blocks data 7530 1727096036.76022: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096036.76023: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096036.76025: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096036.76101: in VariableManager get_vars() 7530 1727096036.76125: done with get_vars() 7530 1727096036.76209: done processing included file 7530 1727096036.76210: iterating over new_blocks loaded from include file 7530 1727096036.76212: in VariableManager get_vars() 7530 1727096036.76231: done with get_vars() 7530 1727096036.76232: filtering new block on tags 7530 1727096036.76244: done filtering new block on tags 7530 1727096036.76246: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7530 1727096036.76250: extending task lists for all hosts with included blocks 7530 1727096036.76486: done extending task lists 7530 1727096036.76487: done processing included files 7530 1727096036.76488: results queue empty 7530 1727096036.76489: checking for any_errors_fatal 7530 1727096036.76491: done checking for any_errors_fatal 7530 1727096036.76491: checking for max_fail_percentage 7530 
1727096036.76492: done checking for max_fail_percentage 7530 1727096036.76492: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.76493: done checking to see if all hosts have failed 7530 1727096036.76493: getting the remaining hosts for this loop 7530 1727096036.76494: done getting the remaining hosts for this loop 7530 1727096036.76496: getting the next task for host managed_node3 7530 1727096036.76499: done getting next task for host managed_node3 7530 1727096036.76500: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7530 1727096036.76502: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.76504: getting variables 7530 1727096036.76505: in VariableManager get_vars() 7530 1727096036.76517: Calling all_inventory to load vars for managed_node3 7530 1727096036.76519: Calling groups_inventory to load vars for managed_node3 7530 1727096036.76520: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.76524: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.76526: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.76530: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.77246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.78101: done with get_vars() 7530 1727096036.78122: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:53:56 -0400 (0:00:00.051) 0:00:27.570 ****** 7530 1727096036.78186: entering _queue_task() for managed_node3/include_tasks 7530 1727096036.78454: worker is 1 (out of 1 available) 7530 1727096036.78469: exiting _queue_task() for managed_node3/include_tasks 7530 1727096036.78482: done queuing things up, now waiting for results queue to drain 7530 1727096036.78483: waiting for pending results... 
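The "^ state is: HOST STATE: block=2, task=22, ... tasks child state? (...)" dumps describe a recursive cursor into nested task blocks: each `include_tasks` pushes another child state, so the nesting depth of the printout tracks the include depth. A hedged, heavily simplified model — field names follow the log, but the class is illustrative, not Ansible's actual `HostState`:

```python
# Hedged sketch of the recursive host-state cursor behind the "^ state is:"
# lines. Only the block/task counters and the "tasks child state?" pointer are
# modeled; everything else in the real structure is omitted.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    block: int = 0
    task: int = 0
    tasks_child: Optional["HostState"] = None  # "tasks child state? (...)"

    def depth(self) -> int:
        """Include-nesting depth: 1 for a flat state, +1 per child state."""
        return 1 + (self.tasks_child.depth() if self.tasks_child else 0)

# The state printed before "Gather current interface info" nests three children:
# manage_test_interface.yml -> show_interfaces.yml -> get_current_interfaces.yml.
outer = HostState(block=2, task=22,
                  tasks_child=HostState(block=0, task=4,
                  tasks_child=HostState(block=0, task=2,
                  tasks_child=HostState(block=0, task=1))))
```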
7530 1727096036.78664: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7530 1727096036.78738: in run() - task 0afff68d-5257-086b-f4f0-000000001030 7530 1727096036.78749: variable 'ansible_search_path' from source: unknown 7530 1727096036.78754: variable 'ansible_search_path' from source: unknown 7530 1727096036.78783: calling self._execute() 7530 1727096036.78863: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.78866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.78880: variable 'omit' from source: magic vars 7530 1727096036.79159: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.79171: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.79177: _execute() done 7530 1727096036.79180: dumping result to json 7530 1727096036.79183: done dumping result, returning 7530 1727096036.79190: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-086b-f4f0-000000001030] 7530 1727096036.79194: sending task result for task 0afff68d-5257-086b-f4f0-000000001030 7530 1727096036.79281: done sending task result for task 0afff68d-5257-086b-f4f0-000000001030 7530 1727096036.79284: WORKER PROCESS EXITING 7530 1727096036.79315: no more pending results, returning what we have 7530 1727096036.79319: in VariableManager get_vars() 7530 1727096036.79377: Calling all_inventory to load vars for managed_node3 7530 1727096036.79380: Calling groups_inventory to load vars for managed_node3 7530 1727096036.79382: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.79396: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.79399: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.79402: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.80206: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.81202: done with get_vars() 7530 1727096036.81219: variable 'ansible_search_path' from source: unknown 7530 1727096036.81220: variable 'ansible_search_path' from source: unknown 7530 1727096036.81263: we have included files to process 7530 1727096036.81264: generating all_blocks data 7530 1727096036.81265: done generating all_blocks data 7530 1727096036.81266: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096036.81269: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096036.81270: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096036.81458: done processing included file 7530 1727096036.81459: iterating over new_blocks loaded from include file 7530 1727096036.81460: in VariableManager get_vars() 7530 1727096036.81480: done with get_vars() 7530 1727096036.81481: filtering new block on tags 7530 1727096036.81494: done filtering new block on tags 7530 1727096036.81495: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7530 1727096036.81499: extending task lists for all hosts with included blocks 7530 1727096036.81591: done extending task lists 7530 1727096036.81592: done processing included files 7530 1727096036.81593: results queue empty 7530 1727096036.81593: checking for any_errors_fatal 7530 1727096036.81596: done checking for any_errors_fatal 7530 1727096036.81596: checking for max_fail_percentage 7530 1727096036.81597: done checking for max_fail_percentage 7530 
1727096036.81597: checking to see if all hosts have failed and the running result is not ok 7530 1727096036.81598: done checking to see if all hosts have failed 7530 1727096036.81598: getting the remaining hosts for this loop 7530 1727096036.81599: done getting the remaining hosts for this loop 7530 1727096036.81601: getting the next task for host managed_node3 7530 1727096036.81603: done getting next task for host managed_node3 7530 1727096036.81605: ^ task is: TASK: Gather current interface info 7530 1727096036.81608: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096036.81609: getting variables 7530 1727096036.81610: in VariableManager get_vars() 7530 1727096036.81621: Calling all_inventory to load vars for managed_node3 7530 1727096036.81623: Calling groups_inventory to load vars for managed_node3 7530 1727096036.81624: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096036.81631: Calling all_plugins_play to load vars for managed_node3 7530 1727096036.81632: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096036.81634: Calling groups_plugins_play to load vars for managed_node3 7530 1727096036.82287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096036.83142: done with get_vars() 7530 1727096036.83162: done getting variables 7530 1727096036.83197: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:53:56 -0400 (0:00:00.050) 0:00:27.620 ****** 7530 1727096036.83220: entering _queue_task() for managed_node3/command 7530 1727096036.83485: worker is 1 (out of 1 available) 7530 1727096036.83498: exiting _queue_task() for managed_node3/command 7530 1727096036.83510: done queuing things up, now waiting for results queue to drain 7530 1727096036.83512: waiting for pending results... 
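Before shipping the module, the `_low_level_execute_command()` calls that follow run two stock shell one-liners over the SSH connection: a home-directory probe (`echo ~ && sleep 0`) and a permission-restricted temp-dir creation (`( umask 77 && mkdir -p ... )`). The sketch below reproduces both patterns locally rather than over SSH; it assumes a POSIX `/bin/sh`, and the paths are illustrative, not the log's exact temp dir:

```python
# Reproduce the two shell patterns from _low_level_execute_command() locally.
# Assumes a POSIX /bin/sh is available; run locally instead of over SSH.
import os
import subprocess
import tempfile

# 1. Home-directory probe: the same command the log executes first
#    ("/bin/sh -c 'echo ~ && sleep 0'"), which returned "/root" there.
home = subprocess.run(["/bin/sh", "-c", "echo ~ && sleep 0"],
                      capture_output=True, text=True).stdout.strip()

# 2. Temp-dir creation: umask 77 yields a 0700 (owner-only) directory,
#    mirroring the '( umask 77 && mkdir -p "..." )' command in the log.
tmp_parent = tempfile.mkdtemp()
remote_tmp = os.path.join(tmp_parent, "ansible-tmp-demo")
subprocess.run(["/bin/sh", "-c",
                f'( umask 77 && mkdir -p "{remote_tmp}" ) && sleep 0'],
               check=True)
mode = oct(os.stat(remote_tmp).st_mode & 0o777)
```

The restrictive umask matters because the directory will hold the serialized module payload, including any sensitive task arguments.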
7530 1727096036.83689: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7530 1727096036.83769: in run() - task 0afff68d-5257-086b-f4f0-000000001067 7530 1727096036.83783: variable 'ansible_search_path' from source: unknown 7530 1727096036.83787: variable 'ansible_search_path' from source: unknown 7530 1727096036.83814: calling self._execute() 7530 1727096036.83893: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.83896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.83906: variable 'omit' from source: magic vars 7530 1727096036.84190: variable 'ansible_distribution_major_version' from source: facts 7530 1727096036.84201: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096036.84207: variable 'omit' from source: magic vars 7530 1727096036.84240: variable 'omit' from source: magic vars 7530 1727096036.84268: variable 'omit' from source: magic vars 7530 1727096036.84305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096036.84333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096036.84348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096036.84363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.84375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096036.84401: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096036.84405: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.84408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.84475: Set connection 
var ansible_pipelining to False 7530 1727096036.84480: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096036.84486: Set connection var ansible_timeout to 10 7530 1727096036.84493: Set connection var ansible_shell_executable to /bin/sh 7530 1727096036.84496: Set connection var ansible_shell_type to sh 7530 1727096036.84498: Set connection var ansible_connection to ssh 7530 1727096036.84520: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.84522: variable 'ansible_connection' from source: unknown 7530 1727096036.84526: variable 'ansible_module_compression' from source: unknown 7530 1727096036.84531: variable 'ansible_shell_type' from source: unknown 7530 1727096036.84534: variable 'ansible_shell_executable' from source: unknown 7530 1727096036.84537: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096036.84539: variable 'ansible_pipelining' from source: unknown 7530 1727096036.84542: variable 'ansible_timeout' from source: unknown 7530 1727096036.84544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096036.84647: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096036.84656: variable 'omit' from source: magic vars 7530 1727096036.84661: starting attempt loop 7530 1727096036.84664: running the handler 7530 1727096036.84681: _low_level_execute_command(): starting 7530 1727096036.84687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096036.85215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096036.85220: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096036.85224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096036.85226: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096036.85228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.85277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.85297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.85303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.85334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.87034: stdout chunk (state=3): >>>/root <<< 7530 1727096036.87123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.87155: stderr chunk (state=3): >>><<< 7530 1727096036.87159: stdout chunk (state=3): >>><<< 7530 1727096036.87183: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.87196: _low_level_execute_command(): starting 7530 1727096036.87202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872 `" && echo ansible-tmp-1727096036.8718364-8564-142039326076872="` echo /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872 `" ) && sleep 0' 7530 1727096036.87675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.87678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.87690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096036.87692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096036.87694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.87742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.87750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.87753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.87788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.89778: stdout chunk (state=3): >>>ansible-tmp-1727096036.8718364-8564-142039326076872=/root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872 <<< 7530 1727096036.89875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.89905: stderr chunk (state=3): >>><<< 7530 1727096036.89908: stdout chunk (state=3): >>><<< 7530 1727096036.89925: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096036.8718364-8564-142039326076872=/root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.89954: variable 'ansible_module_compression' from source: unknown 7530 1727096036.90001: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096036.90033: variable 'ansible_facts' from source: unknown 7530 1727096036.90092: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py 7530 1727096036.90205: Sending initial data 7530 1727096036.90209: Sent initial data (154 bytes) 7530 1727096036.90641: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.90656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.90674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.90730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.90734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.90740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.90781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.92464: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096036.92489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096036.92523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp17ypva6i /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py <<< 7530 1727096036.92529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py" <<< 7530 1727096036.92554: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp17ypva6i" to remote "/root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py" <<< 7530 1727096036.92559: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py" <<< 7530 1727096036.93050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.93096: stderr chunk (state=3): >>><<< 7530 1727096036.93100: stdout chunk (state=3): >>><<< 7530 1727096036.93142: done transferring module to remote 7530 1727096036.93152: _low_level_execute_command(): starting 7530 1727096036.93157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/ /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py && sleep 0' 7530 1727096036.93618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096036.93621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096036.93624: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.93626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096036.93632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096036.93634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.93674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.93699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.93704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.93725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096036.95562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096036.95588: stderr chunk (state=3): >>><<< 7530 1727096036.95592: stdout chunk (state=3): >>><<< 7530 1727096036.95605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096036.95608: _low_level_execute_command(): starting 7530 1727096036.95614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/AnsiballZ_command.py && sleep 0' 7530 1727096036.96219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096036.96298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096036.96302: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096036.96309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096036.96345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.12513: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:57.119302", "end": "2024-09-23 08:53:57.122637", "delta": "0:00:00.003335", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096037.14478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096037.14482: stdout chunk (state=3): >>><<< 7530 1727096037.14485: stderr chunk (state=3): >>><<< 7530 1727096037.14487: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:57.119302", "end": "2024-09-23 08:53:57.122637", "delta": "0:00:00.003335", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096037.14490: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096037.14493: _low_level_execute_command(): starting 7530 1727096037.14495: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096036.8718364-8564-142039326076872/ > /dev/null 2>&1 && sleep 0' 7530 1727096037.15151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 7530 1727096037.15158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.15172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096037.15186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096037.15198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096037.15209: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096037.15225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096037.15243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096037.15251: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096037.15257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096037.15265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.15277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096037.15289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096037.15296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096037.15327: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096037.15387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096037.15401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096037.15420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096037.15489: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.17414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096037.17438: stdout chunk (state=3): >>><<< 7530 1727096037.17451: stderr chunk (state=3): >>><<< 7530 1727096037.17498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096037.17510: handler run complete 7530 1727096037.17618: Evaluated conditional (False): False 7530 1727096037.17621: attempt loop complete, returning result 7530 1727096037.17624: _execute() done 7530 1727096037.17626: dumping result to json 7530 1727096037.17628: done dumping result, returning 7530 1727096037.17630: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-086b-f4f0-000000001067] 7530 
1727096037.17632: sending task result for task 0afff68d-5257-086b-f4f0-000000001067 ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003335", "end": "2024-09-23 08:53:57.122637", "rc": 0, "start": "2024-09-23 08:53:57.119302" } STDOUT: eth0 lo peerveth0 veth0 7530 1727096037.18072: no more pending results, returning what we have 7530 1727096037.18076: results queue empty 7530 1727096037.18077: checking for any_errors_fatal 7530 1727096037.18078: done checking for any_errors_fatal 7530 1727096037.18079: checking for max_fail_percentage 7530 1727096037.18082: done checking for max_fail_percentage 7530 1727096037.18083: checking to see if all hosts have failed and the running result is not ok 7530 1727096037.18084: done checking to see if all hosts have failed 7530 1727096037.18085: getting the remaining hosts for this loop 7530 1727096037.18087: done getting the remaining hosts for this loop 7530 1727096037.18090: getting the next task for host managed_node3 7530 1727096037.18104: done getting next task for host managed_node3 7530 1727096037.18107: ^ task is: TASK: Set current_interfaces 7530 1727096037.18113: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096037.18118: getting variables 7530 1727096037.18120: in VariableManager get_vars() 7530 1727096037.18189: Calling all_inventory to load vars for managed_node3 7530 1727096037.18192: Calling groups_inventory to load vars for managed_node3 7530 1727096037.18195: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096037.18202: done sending task result for task 0afff68d-5257-086b-f4f0-000000001067 7530 1727096037.18208: WORKER PROCESS EXITING 7530 1727096037.18222: Calling all_plugins_play to load vars for managed_node3 7530 1727096037.18225: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096037.18229: Calling groups_plugins_play to load vars for managed_node3 7530 1727096037.19969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096037.21701: done with get_vars() 7530 1727096037.21730: done getting variables 7530 1727096037.21800: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:53:57 -0400 (0:00:00.386) 0:00:28.006 ****** 7530 1727096037.21839: entering _queue_task() for managed_node3/set_fact 7530 1727096037.22433: worker is 1 (out of 1 
available) 7530 1727096037.22447: exiting _queue_task() for managed_node3/set_fact 7530 1727096037.22460: done queuing things up, now waiting for results queue to drain 7530 1727096037.22461: waiting for pending results... 7530 1727096037.22688: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7530 1727096037.22805: in run() - task 0afff68d-5257-086b-f4f0-000000001068 7530 1727096037.22821: variable 'ansible_search_path' from source: unknown 7530 1727096037.22826: variable 'ansible_search_path' from source: unknown 7530 1727096037.22861: calling self._execute() 7530 1727096037.22952: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.22957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.22971: variable 'omit' from source: magic vars 7530 1727096037.23322: variable 'ansible_distribution_major_version' from source: facts 7530 1727096037.23337: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096037.23343: variable 'omit' from source: magic vars 7530 1727096037.23389: variable 'omit' from source: magic vars 7530 1727096037.23499: variable '_current_interfaces' from source: set_fact 7530 1727096037.23566: variable 'omit' from source: magic vars 7530 1727096037.23606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096037.23642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096037.23662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096037.23680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096037.23690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096037.23721: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096037.23724: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.23727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.23826: Set connection var ansible_pipelining to False 7530 1727096037.23835: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096037.23841: Set connection var ansible_timeout to 10 7530 1727096037.23851: Set connection var ansible_shell_executable to /bin/sh 7530 1727096037.23853: Set connection var ansible_shell_type to sh 7530 1727096037.23856: Set connection var ansible_connection to ssh 7530 1727096037.23882: variable 'ansible_shell_executable' from source: unknown 7530 1727096037.23886: variable 'ansible_connection' from source: unknown 7530 1727096037.23889: variable 'ansible_module_compression' from source: unknown 7530 1727096037.23891: variable 'ansible_shell_type' from source: unknown 7530 1727096037.23893: variable 'ansible_shell_executable' from source: unknown 7530 1727096037.23896: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.23898: variable 'ansible_pipelining' from source: unknown 7530 1727096037.23908: variable 'ansible_timeout' from source: unknown 7530 1727096037.23910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.24042: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096037.24127: variable 'omit' from source: magic vars 7530 1727096037.24131: starting attempt loop 7530 1727096037.24133: running the handler 7530 1727096037.24135: handler run complete 7530 1727096037.24137: attempt loop complete, returning result 
7530 1727096037.24139: _execute() done 7530 1727096037.24141: dumping result to json 7530 1727096037.24142: done dumping result, returning 7530 1727096037.24144: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0afff68d-5257-086b-f4f0-000000001068] 7530 1727096037.24146: sending task result for task 0afff68d-5257-086b-f4f0-000000001068 7530 1727096037.24207: done sending task result for task 0afff68d-5257-086b-f4f0-000000001068 7530 1727096037.24209: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7530 1727096037.24289: no more pending results, returning what we have 7530 1727096037.24292: results queue empty 7530 1727096037.24293: checking for any_errors_fatal 7530 1727096037.24301: done checking for any_errors_fatal 7530 1727096037.24302: checking for max_fail_percentage 7530 1727096037.24303: done checking for max_fail_percentage 7530 1727096037.24304: checking to see if all hosts have failed and the running result is not ok 7530 1727096037.24305: done checking to see if all hosts have failed 7530 1727096037.24306: getting the remaining hosts for this loop 7530 1727096037.24307: done getting the remaining hosts for this loop 7530 1727096037.24311: getting the next task for host managed_node3 7530 1727096037.24318: done getting next task for host managed_node3 7530 1727096037.24321: ^ task is: TASK: Show current_interfaces 7530 1727096037.24324: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096037.24328: getting variables 7530 1727096037.24329: in VariableManager get_vars() 7530 1727096037.24482: Calling all_inventory to load vars for managed_node3 7530 1727096037.24485: Calling groups_inventory to load vars for managed_node3 7530 1727096037.24488: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096037.24498: Calling all_plugins_play to load vars for managed_node3 7530 1727096037.24501: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096037.24504: Calling groups_plugins_play to load vars for managed_node3 7530 1727096037.25787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096037.27324: done with get_vars() 7530 1727096037.27357: done getting variables 7530 1727096037.27425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:53:57 -0400 (0:00:00.056) 0:00:28.062 ****** 7530 1727096037.27464: entering _queue_task() for managed_node3/debug 7530 1727096037.27826: 
worker is 1 (out of 1 available) 7530 1727096037.27842: exiting _queue_task() for managed_node3/debug 7530 1727096037.27854: done queuing things up, now waiting for results queue to drain 7530 1727096037.27855: waiting for pending results... 7530 1727096037.28189: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7530 1727096037.28314: in run() - task 0afff68d-5257-086b-f4f0-000000001031 7530 1727096037.28318: variable 'ansible_search_path' from source: unknown 7530 1727096037.28321: variable 'ansible_search_path' from source: unknown 7530 1727096037.28335: calling self._execute() 7530 1727096037.28441: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.28445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.28473: variable 'omit' from source: magic vars 7530 1727096037.28858: variable 'ansible_distribution_major_version' from source: facts 7530 1727096037.28873: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096037.28877: variable 'omit' from source: magic vars 7530 1727096037.28973: variable 'omit' from source: magic vars 7530 1727096037.29024: variable 'current_interfaces' from source: set_fact 7530 1727096037.29053: variable 'omit' from source: magic vars 7530 1727096037.29093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096037.29127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096037.29149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096037.29175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096037.29186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 
1727096037.29273: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096037.29277: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.29279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.29306: Set connection var ansible_pipelining to False 7530 1727096037.29311: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096037.29317: Set connection var ansible_timeout to 10 7530 1727096037.29327: Set connection var ansible_shell_executable to /bin/sh 7530 1727096037.29335: Set connection var ansible_shell_type to sh 7530 1727096037.29338: Set connection var ansible_connection to ssh 7530 1727096037.29361: variable 'ansible_shell_executable' from source: unknown 7530 1727096037.29365: variable 'ansible_connection' from source: unknown 7530 1727096037.29371: variable 'ansible_module_compression' from source: unknown 7530 1727096037.29373: variable 'ansible_shell_type' from source: unknown 7530 1727096037.29376: variable 'ansible_shell_executable' from source: unknown 7530 1727096037.29378: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.29380: variable 'ansible_pipelining' from source: unknown 7530 1727096037.29382: variable 'ansible_timeout' from source: unknown 7530 1727096037.29384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.29517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096037.29672: variable 'omit' from source: magic vars 7530 1727096037.29674: starting attempt loop 7530 1727096037.29676: running the handler 7530 1727096037.29678: handler run complete 7530 1727096037.29679: attempt loop 
complete, returning result 7530 1727096037.29681: _execute() done 7530 1727096037.29682: dumping result to json 7530 1727096037.29684: done dumping result, returning 7530 1727096037.29686: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0afff68d-5257-086b-f4f0-000000001031] 7530 1727096037.29687: sending task result for task 0afff68d-5257-086b-f4f0-000000001031 7530 1727096037.29747: done sending task result for task 0afff68d-5257-086b-f4f0-000000001031 7530 1727096037.29749: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7530 1727096037.29811: no more pending results, returning what we have 7530 1727096037.29814: results queue empty 7530 1727096037.29815: checking for any_errors_fatal 7530 1727096037.29822: done checking for any_errors_fatal 7530 1727096037.29822: checking for max_fail_percentage 7530 1727096037.29824: done checking for max_fail_percentage 7530 1727096037.29825: checking to see if all hosts have failed and the running result is not ok 7530 1727096037.29826: done checking to see if all hosts have failed 7530 1727096037.29826: getting the remaining hosts for this loop 7530 1727096037.29830: done getting the remaining hosts for this loop 7530 1727096037.29834: getting the next task for host managed_node3 7530 1727096037.29842: done getting next task for host managed_node3 7530 1727096037.29844: ^ task is: TASK: Install iproute 7530 1727096037.29847: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 7530 1727096037.29852: getting variables 7530 1727096037.29853: in VariableManager get_vars() 7530 1727096037.29907: Calling all_inventory to load vars for managed_node3 7530 1727096037.29909: Calling groups_inventory to load vars for managed_node3 7530 1727096037.29912: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096037.29923: Calling all_plugins_play to load vars for managed_node3 7530 1727096037.29930: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096037.29934: Calling groups_plugins_play to load vars for managed_node3 7530 1727096037.31614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096037.33120: done with get_vars() 7530 1727096037.33152: done getting variables 7530 1727096037.33214: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 08:53:57 -0400 (0:00:00.057) 0:00:28.120 ****** 7530 1727096037.33249: entering _queue_task() for managed_node3/package 7530 1727096037.33597: worker is 1 (out of 1 available) 7530 1727096037.33611: exiting _queue_task() for managed_node3/package 7530 1727096037.33623: done queuing things up, now waiting for results queue to drain 7530 1727096037.33625: waiting for pending results... 
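The trace that follows walks through Ansible's remote execution pipeline for the `Install iproute` task: probe the remote home directory (`echo ~ && sleep 0`), create a temporary directory, transfer the AnsiballZ module over SFTP, `chmod` it, run it with the remote Python, and finally remove the tmpdir. As a minimal illustration of the recurring `_low_level_execute_command()` pattern, the sketch below runs a command through `/bin/sh -c` locally; `low_level_execute` is a hypothetical helper name, not Ansible's actual API, and the real implementation goes through the SSH connection plugin rather than `subprocess`.

```python
import subprocess


def low_level_execute(cmd: str) -> tuple[int, str, str]:
    """Run a shell command the way the log's
    _low_level_execute_command() does (simplified local sketch:
    wrap the command in /bin/sh -c and capture rc/stdout/stderr)."""
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr


# First step in the trace: discover the remote user's home directory.
# The trailing "sleep 0" mirrors the command seen in the log.
rc, out, err = low_level_execute("echo ~ && sleep 0")
```

Every subsequent remote step in the trace (mkdir of the tmpdir, chmod, module execution, cleanup) reuses this same wrap-in-`/bin/sh -c` shape, which is why the log repeats the OpenSSH configuration banter for each step.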
7530 1727096037.33992: running TaskExecutor() for managed_node3/TASK: Install iproute 7530 1727096037.34042: in run() - task 0afff68d-5257-086b-f4f0-000000000e02 7530 1727096037.34055: variable 'ansible_search_path' from source: unknown 7530 1727096037.34059: variable 'ansible_search_path' from source: unknown 7530 1727096037.34174: calling self._execute() 7530 1727096037.34198: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.34205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.34215: variable 'omit' from source: magic vars 7530 1727096037.34610: variable 'ansible_distribution_major_version' from source: facts 7530 1727096037.34625: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096037.34628: variable 'omit' from source: magic vars 7530 1727096037.34674: variable 'omit' from source: magic vars 7530 1727096037.34873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096037.37018: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096037.37082: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096037.37203: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096037.37207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096037.37210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096037.37290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096037.37341: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096037.37363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096037.37405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096037.37419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096037.37530: variable '__network_is_ostree' from source: set_fact 7530 1727096037.37538: variable 'omit' from source: magic vars 7530 1727096037.37580: variable 'omit' from source: magic vars 7530 1727096037.37610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096037.37645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096037.37659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096037.37677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096037.37775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096037.37778: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096037.37781: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.37784: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 7530 1727096037.37824: Set connection var ansible_pipelining to False 7530 1727096037.37834: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096037.37840: Set connection var ansible_timeout to 10 7530 1727096037.37851: Set connection var ansible_shell_executable to /bin/sh 7530 1727096037.37854: Set connection var ansible_shell_type to sh 7530 1727096037.37857: Set connection var ansible_connection to ssh 7530 1727096037.37887: variable 'ansible_shell_executable' from source: unknown 7530 1727096037.37890: variable 'ansible_connection' from source: unknown 7530 1727096037.37897: variable 'ansible_module_compression' from source: unknown 7530 1727096037.37900: variable 'ansible_shell_type' from source: unknown 7530 1727096037.37902: variable 'ansible_shell_executable' from source: unknown 7530 1727096037.37905: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096037.37907: variable 'ansible_pipelining' from source: unknown 7530 1727096037.37909: variable 'ansible_timeout' from source: unknown 7530 1727096037.37911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096037.38073: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096037.38077: variable 'omit' from source: magic vars 7530 1727096037.38079: starting attempt loop 7530 1727096037.38082: running the handler 7530 1727096037.38084: variable 'ansible_facts' from source: unknown 7530 1727096037.38086: variable 'ansible_facts' from source: unknown 7530 1727096037.38089: _low_level_execute_command(): starting 7530 1727096037.38092: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096037.38815: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096037.38827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.38842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096037.38862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096037.38881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096037.38974: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096037.39030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096037.39061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.40741: stdout chunk (state=3): >>>/root <<< 7530 1727096037.40892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096037.40896: stdout chunk (state=3): >>><<< 7530 1727096037.40899: stderr chunk (state=3): >>><<< 7530 1727096037.41021: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096037.41033: _low_level_execute_command(): starting 7530 1727096037.41037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227 `" && echo ansible-tmp-1727096037.4092793-8585-249274982927227="` echo /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227 `" ) && sleep 0' 7530 1727096037.41719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096037.41814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.41831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096037.41863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096037.41885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096037.41908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096037.41996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.43950: stdout chunk (state=3): >>>ansible-tmp-1727096037.4092793-8585-249274982927227=/root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227 <<< 7530 1727096037.44092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096037.44095: stdout chunk (state=3): >>><<< 7530 1727096037.44103: stderr chunk (state=3): >>><<< 7530 1727096037.44122: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096037.4092793-8585-249274982927227=/root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096037.44158: variable 'ansible_module_compression' from source: unknown 7530 1727096037.44281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7530 1727096037.44284: variable 'ansible_facts' from source: unknown 7530 1727096037.44387: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py 7530 1727096037.44620: Sending initial data 7530 1727096037.44623: Sent initial data (150 bytes) 7530 1727096037.45131: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096037.45143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.45152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096037.45183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096037.45193: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 7530 1727096037.45267: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096037.45285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096037.45391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.47014: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7530 1727096037.47039: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096037.47099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096037.47165: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpahn9f4if /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py <<< 7530 1727096037.47181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py" <<< 7530 1727096037.47204: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpahn9f4if" to remote "/root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py" <<< 7530 1727096037.48179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096037.48191: stdout chunk (state=3): >>><<< 7530 1727096037.48220: stderr chunk (state=3): >>><<< 7530 1727096037.48288: done transferring module to remote 7530 1727096037.48305: _low_level_execute_command(): starting 7530 1727096037.48326: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/ /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py && sleep 0' 7530 1727096037.49007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096037.49021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.49088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096037.49146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096037.49172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096037.49195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096037.49265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.51228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096037.51233: stdout chunk (state=3): >>><<< 7530 1727096037.51235: stderr chunk (state=3): >>><<< 7530 1727096037.51254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096037.51341: _low_level_execute_command(): starting 7530 1727096037.51345: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/AnsiballZ_dnf.py && sleep 0' 7530 1727096037.51873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096037.51888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096037.51901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096037.51918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096037.51937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096037.51948: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096037.51960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096037.51980: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096037.51992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096037.52091: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096037.52116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096037.52133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096037.52222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096037.96955: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7530 1727096038.02062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
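The stdout chunk above is the dnf module's result: a single JSON document that the controller parses to build the task outcome (`changed`, `rc`, `msg`, and the echoed `invocation` arguments). A small sketch of decoding such a payload, using an abbreviated copy of the JSON from the log (the full document also carries the `results` list and the complete `module_args`):

```python
import json

# Abbreviated module result as emitted by AnsiballZ_dnf.py in the log.
raw = '{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0}'

# The controller parses the final stdout chunk as JSON; booleans and
# nulls map to Python False/None, so "changed": false becomes False.
result = json.loads(raw)

idempotent = result["rc"] == 0 and not result["changed"]
```

Here `idempotent` is true: iproute was already installed, so the module reported "Nothing to do" and the task comes back `ok` rather than `changed`.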
<<< 7530 1727096038.02086: stdout chunk (state=3): >>><<< 7530 1727096038.02100: stderr chunk (state=3): >>><<< 7530 1727096038.02252: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096038.02261: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096038.02264: _low_level_execute_command(): starting 7530 1727096038.02266: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096037.4092793-8585-249274982927227/ > /dev/null 2>&1 && sleep 0' 7530 1727096038.02829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096038.02843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096038.02880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096038.02892: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 7530 1727096038.02935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.02987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.03005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.03055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.03094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.05023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096038.05058: stderr chunk (state=3): >>><<< 7530 1727096038.05062: stdout chunk (state=3): >>><<< 7530 1727096038.05081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096038.05274: handler run complete 7530 1727096038.05277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096038.05476: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096038.05530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096038.05566: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096038.05629: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096038.05705: variable '__install_status' from source: set_fact 7530 1727096038.05740: Evaluated conditional (__install_status is success): True 7530 1727096038.05764: attempt loop complete, returning result 7530 1727096038.05775: _execute() done 7530 1727096038.05782: dumping result to json 7530 1727096038.05792: done dumping result, returning 7530 1727096038.05804: done running TaskExecutor() for managed_node3/TASK: Install iproute [0afff68d-5257-086b-f4f0-000000000e02] 7530 1727096038.05814: sending task result for task 0afff68d-5257-086b-f4f0-000000000e02 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7530 1727096038.06145: no more pending results, returning what we have 7530 1727096038.06157: results queue empty 7530 1727096038.06159: checking for any_errors_fatal 7530 1727096038.06166: done checking for any_errors_fatal 7530 1727096038.06167: checking for max_fail_percentage 7530 1727096038.06171: done checking for max_fail_percentage 7530 
1727096038.06172: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.06173: done checking to see if all hosts have failed 7530 1727096038.06174: getting the remaining hosts for this loop 7530 1727096038.06176: done getting the remaining hosts for this loop 7530 1727096038.06180: getting the next task for host managed_node3 7530 1727096038.06188: done getting next task for host managed_node3 7530 1727096038.06191: ^ task is: TASK: Create veth interface {{ interface }} 7530 1727096038.06194: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096038.06198: getting variables 7530 1727096038.06200: in VariableManager get_vars() 7530 1727096038.06253: Calling all_inventory to load vars for managed_node3 7530 1727096038.06256: Calling groups_inventory to load vars for managed_node3 7530 1727096038.06379: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.06392: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.06396: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.06400: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.06997: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e02 7530 1727096038.07001: WORKER PROCESS EXITING 7530 1727096038.07988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.11171: done with get_vars() 7530 1727096038.11271: done getting variables 7530 1727096038.11379: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096038.11627: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 08:53:58 -0400 (0:00:00.786) 0:00:28.906 ****** 7530 1727096038.11860: entering _queue_task() for managed_node3/command 7530 1727096038.12350: worker is 1 (out of 1 available) 7530 1727096038.12363: exiting _queue_task() for managed_node3/command 7530 1727096038.12402: done queuing things up, now waiting for results queue to drain 7530 1727096038.12405: waiting for pending results... 
7530 1727096038.12774: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7530 1727096038.12893: in run() - task 0afff68d-5257-086b-f4f0-000000000e03 7530 1727096038.12916: variable 'ansible_search_path' from source: unknown 7530 1727096038.12926: variable 'ansible_search_path' from source: unknown 7530 1727096038.13242: variable 'interface' from source: play vars 7530 1727096038.13338: variable 'interface' from source: play vars 7530 1727096038.13579: variable 'interface' from source: play vars 7530 1727096038.13613: Loaded config def from plugin (lookup/items) 7530 1727096038.13617: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7530 1727096038.13648: variable 'omit' from source: magic vars 7530 1727096038.13808: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.13817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.13826: variable 'omit' from source: magic vars 7530 1727096038.14114: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.14131: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.14599: variable 'type' from source: play vars 7530 1727096038.14605: variable 'state' from source: include params 7530 1727096038.14610: variable 'interface' from source: play vars 7530 1727096038.14613: variable 'current_interfaces' from source: set_fact 7530 1727096038.14624: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7530 1727096038.14627: when evaluation is False, skipping this task 7530 1727096038.14666: variable 'item' from source: unknown 7530 1727096038.14748: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7530 1727096038.15229: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.15233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.15236: variable 'omit' from source: magic vars 7530 1727096038.15239: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.15242: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.15651: variable 'type' from source: play vars 7530 1727096038.15655: variable 'state' from source: include params 7530 1727096038.15658: variable 'interface' from source: play vars 7530 1727096038.15660: variable 'current_interfaces' from source: set_fact 7530 1727096038.15662: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7530 1727096038.15664: when evaluation is False, skipping this task 7530 1727096038.15666: variable 'item' from source: unknown 7530 1727096038.15671: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7530 1727096038.15944: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.15949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.15952: variable 'omit' from source: magic vars 7530 1727096038.15955: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.15957: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.15959: variable 'type' from source: play vars 7530 
1727096038.15961: variable 'state' from source: include params 7530 1727096038.15963: variable 'interface' from source: play vars 7530 1727096038.15965: variable 'current_interfaces' from source: set_fact 7530 1727096038.15967: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7530 1727096038.15973: when evaluation is False, skipping this task 7530 1727096038.15988: variable 'item' from source: unknown 7530 1727096038.16052: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7530 1727096038.16129: dumping result to json 7530 1727096038.16133: done dumping result, returning 7530 1727096038.16135: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0afff68d-5257-086b-f4f0-000000000e03] 7530 1727096038.16138: sending task result for task 0afff68d-5257-086b-f4f0-000000000e03 skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7530 1727096038.16217: no more pending results, returning what we have 7530 1727096038.16221: results queue empty 7530 1727096038.16222: checking for any_errors_fatal 7530 1727096038.16233: done checking for any_errors_fatal 7530 1727096038.16234: checking for max_fail_percentage 7530 1727096038.16236: done checking for max_fail_percentage 7530 1727096038.16237: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.16238: done checking to see if all hosts have failed 7530 1727096038.16239: getting the remaining hosts for this loop 7530 1727096038.16240: done getting the remaining hosts for this loop 7530 1727096038.16244: getting the next task for host managed_node3 7530 1727096038.16250: done getting next task for host managed_node3 
7530 1727096038.16253: ^ task is: TASK: Set up veth as managed by NetworkManager 7530 1727096038.16257: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096038.16261: getting variables 7530 1727096038.16326: in VariableManager get_vars() 7530 1727096038.16431: Calling all_inventory to load vars for managed_node3 7530 1727096038.16434: Calling groups_inventory to load vars for managed_node3 7530 1727096038.16436: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.16470: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e03 7530 1727096038.16475: WORKER PROCESS EXITING 7530 1727096038.16491: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.16494: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.16497: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.17436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.18302: done with get_vars() 7530 1727096038.18326: done getting variables 7530 1727096038.18377: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 08:53:58 -0400 (0:00:00.065) 0:00:28.972 ****** 7530 1727096038.18405: entering _queue_task() for managed_node3/command 7530 1727096038.18775: worker is 1 (out of 1 available) 7530 1727096038.18789: exiting _queue_task() for managed_node3/command 7530 1727096038.18803: done queuing things up, now waiting for results queue to drain 7530 1727096038.18805: waiting for pending results... 7530 1727096038.19052: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7530 1727096038.19240: in run() - task 0afff68d-5257-086b-f4f0-000000000e04 7530 1727096038.19245: variable 'ansible_search_path' from source: unknown 7530 1727096038.19248: variable 'ansible_search_path' from source: unknown 7530 1727096038.19251: calling self._execute() 7530 1727096038.19338: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.19453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.19458: variable 'omit' from source: magic vars 7530 1727096038.19731: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.19874: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.19918: variable 'type' from source: play vars 7530 1727096038.19928: variable 'state' from source: include params 7530 1727096038.19938: Evaluated conditional (type == 'veth' and state == 'present'): False 7530 1727096038.19945: when evaluation is False, skipping this task 7530 1727096038.19952: _execute() done 7530 1727096038.19959: dumping result to json 7530 1727096038.19966: done dumping result, returning 7530 1727096038.19980: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-086b-f4f0-000000000e04] 7530 
1727096038.19995: sending task result for task 0afff68d-5257-086b-f4f0-000000000e04 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7530 1727096038.20223: no more pending results, returning what we have 7530 1727096038.20228: results queue empty 7530 1727096038.20230: checking for any_errors_fatal 7530 1727096038.20245: done checking for any_errors_fatal 7530 1727096038.20246: checking for max_fail_percentage 7530 1727096038.20248: done checking for max_fail_percentage 7530 1727096038.20249: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.20251: done checking to see if all hosts have failed 7530 1727096038.20251: getting the remaining hosts for this loop 7530 1727096038.20253: done getting the remaining hosts for this loop 7530 1727096038.20257: getting the next task for host managed_node3 7530 1727096038.20269: done getting next task for host managed_node3 7530 1727096038.20272: ^ task is: TASK: Delete veth interface {{ interface }} 7530 1727096038.20276: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096038.20280: getting variables 7530 1727096038.20282: in VariableManager get_vars() 7530 1727096038.20343: Calling all_inventory to load vars for managed_node3 7530 1727096038.20347: Calling groups_inventory to load vars for managed_node3 7530 1727096038.20350: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.20363: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.20366: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.20371: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.20381: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e04 7530 1727096038.20384: WORKER PROCESS EXITING 7530 1727096038.21230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.22288: done with get_vars() 7530 1727096038.22318: done getting variables 7530 1727096038.22385: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096038.22506: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 08:53:58 -0400 (0:00:00.041) 0:00:29.013 ****** 7530 1727096038.22537: entering _queue_task() for managed_node3/command 7530 1727096038.23001: worker is 1 (out of 1 available) 7530 1727096038.23018: exiting _queue_task() for managed_node3/command 7530 1727096038.23031: done queuing things up, now waiting for results queue to drain 7530 1727096038.23032: waiting for pending results... 
7530 1727096038.23246: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7530 1727096038.23361: in run() - task 0afff68d-5257-086b-f4f0-000000000e05 7530 1727096038.23391: variable 'ansible_search_path' from source: unknown 7530 1727096038.23400: variable 'ansible_search_path' from source: unknown 7530 1727096038.23436: calling self._execute() 7530 1727096038.23620: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.23623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.23628: variable 'omit' from source: magic vars 7530 1727096038.23965: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.23971: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.24180: variable 'type' from source: play vars 7530 1727096038.24183: variable 'state' from source: include params 7530 1727096038.24186: variable 'interface' from source: play vars 7530 1727096038.24189: variable 'current_interfaces' from source: set_fact 7530 1727096038.24192: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7530 1727096038.24194: variable 'omit' from source: magic vars 7530 1727096038.24197: variable 'omit' from source: magic vars 7530 1727096038.24263: variable 'interface' from source: play vars 7530 1727096038.24474: variable 'omit' from source: magic vars 7530 1727096038.24477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096038.24480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096038.24482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096038.24484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 
1727096038.24486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096038.24488: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096038.24491: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.24493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.24587: Set connection var ansible_pipelining to False 7530 1727096038.24600: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096038.24613: Set connection var ansible_timeout to 10 7530 1727096038.24629: Set connection var ansible_shell_executable to /bin/sh 7530 1727096038.24637: Set connection var ansible_shell_type to sh 7530 1727096038.24643: Set connection var ansible_connection to ssh 7530 1727096038.24674: variable 'ansible_shell_executable' from source: unknown 7530 1727096038.24682: variable 'ansible_connection' from source: unknown 7530 1727096038.24688: variable 'ansible_module_compression' from source: unknown 7530 1727096038.24695: variable 'ansible_shell_type' from source: unknown 7530 1727096038.24701: variable 'ansible_shell_executable' from source: unknown 7530 1727096038.24707: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.24715: variable 'ansible_pipelining' from source: unknown 7530 1727096038.24722: variable 'ansible_timeout' from source: unknown 7530 1727096038.24732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.24883: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096038.24899: variable 'omit' from source: magic vars 7530 
1727096038.24908: starting attempt loop 7530 1727096038.24915: running the handler 7530 1727096038.24940: _low_level_execute_command(): starting 7530 1727096038.24954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096038.25511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096038.25520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.25548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096038.25551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.25601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.25605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.25619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.25660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.27394: stdout chunk (state=3): >>>/root <<< 7530 1727096038.27553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096038.27557: stdout chunk 
(state=3): >>><<< 7530 1727096038.27559: stderr chunk (state=3): >>><<< 7530 1727096038.27588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096038.27671: _low_level_execute_command(): starting 7530 1727096038.27675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100 `" && echo ansible-tmp-1727096038.2759614-8615-263066310107100="` echo /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100 `" ) && sleep 0' 7530 1727096038.28132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096038.28145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.28207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.28211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.28215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.28253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.30272: stdout chunk (state=3): >>>ansible-tmp-1727096038.2759614-8615-263066310107100=/root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100 <<< 7530 1727096038.30374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096038.30406: stderr chunk (state=3): >>><<< 7530 1727096038.30409: stdout chunk (state=3): >>><<< 7530 1727096038.30426: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096038.2759614-8615-263066310107100=/root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096038.30456: variable 'ansible_module_compression' from source: unknown 7530 1727096038.30504: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096038.30533: variable 'ansible_facts' from source: unknown 7530 1727096038.30591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py 7530 1727096038.30703: Sending initial data 7530 1727096038.30706: Sent initial data (154 bytes) 7530 1727096038.31182: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096038.31186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096038.31188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.31190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096038.31192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096038.31195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.31239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.31243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.31254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.31297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.32977: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096038.32998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096038.33038: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp6dx189ag /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py <<< 7530 1727096038.33040: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py" <<< 7530 1727096038.33071: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp6dx189ag" to remote "/root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py" <<< 7530 1727096038.33078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py" <<< 7530 1727096038.33593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096038.33635: stderr chunk (state=3): >>><<< 7530 1727096038.33638: stdout chunk (state=3): >>><<< 7530 1727096038.33662: done transferring module to remote 7530 1727096038.33673: _low_level_execute_command(): starting 7530 1727096038.33683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/ /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py && sleep 0' 7530 1727096038.34136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096038.34141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.34162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096038.34165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.34230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.34233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.34235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.34282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.36181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096038.36185: stdout chunk (state=3): >>><<< 7530 1727096038.36188: stderr chunk (state=3): >>><<< 7530 1727096038.36207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096038.36210: _low_level_execute_command(): starting 7530 1727096038.36215: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/AnsiballZ_command.py && sleep 0' 7530 1727096038.36682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096038.36686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.36688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096038.36691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.36750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.36757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.36760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.36798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.53792: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-23 08:53:58.525297", "end": "2024-09-23 08:53:58.534738", "delta": "0:00:00.009441", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096038.56394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096038.56419: stderr chunk (state=3): >>><<< 7530 1727096038.56422: stdout chunk (state=3): >>><<< 7530 1727096038.56443: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-23 08:53:58.525297", "end": "2024-09-23 08:53:58.534738", "delta": "0:00:00.009441", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096038.56474: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096038.56480: _low_level_execute_command(): starting 7530 1727096038.56485: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096038.2759614-8615-263066310107100/ > /dev/null 2>&1 && sleep 0' 7530 1727096038.56927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096038.56960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096038.56963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.56965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096038.56978: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096038.57023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096038.57026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096038.57031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096038.57072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096038.58937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096038.58965: stderr chunk (state=3): >>><<< 7530 1727096038.58970: stdout chunk (state=3): >>><<< 7530 1727096038.58985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096038.58996: handler run complete 7530 1727096038.59012: Evaluated conditional (False): False 7530 1727096038.59022: attempt loop complete, returning result 7530 1727096038.59025: _execute() done 7530 1727096038.59030: dumping result to json 7530 1727096038.59032: done dumping result, returning 7530 1727096038.59039: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0afff68d-5257-086b-f4f0-000000000e05] 7530 1727096038.59044: sending task result for task 0afff68d-5257-086b-f4f0-000000000e05 7530 1727096038.59146: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e05 7530 1727096038.59148: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.009441", "end": "2024-09-23 08:53:58.534738", "rc": 0, "start": "2024-09-23 08:53:58.525297" } 7530 1727096038.59209: no more pending results, returning what we have 7530 1727096038.59212: results queue empty 7530 1727096038.59213: checking for any_errors_fatal 7530 1727096038.59220: done checking for any_errors_fatal 7530 1727096038.59221: checking for max_fail_percentage 7530 1727096038.59222: done checking for max_fail_percentage 7530 1727096038.59223: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.59225: done checking to see if all hosts have failed 7530 1727096038.59225: getting the remaining hosts for this loop 7530 1727096038.59229: done getting the remaining hosts for this loop 7530 1727096038.59233: getting the next task for host managed_node3 7530 1727096038.59240: done getting next task for host managed_node3 7530 1727096038.59242: ^ task is: TASK: Create dummy interface {{ interface }} 7530 1727096038.59245: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096038.59249: getting variables 7530 1727096038.59250: in VariableManager get_vars() 7530 1727096038.59307: Calling all_inventory to load vars for managed_node3 7530 1727096038.59310: Calling groups_inventory to load vars for managed_node3 7530 1727096038.59312: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.59324: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.59327: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.59363: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.60712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.61581: done with get_vars() 7530 1727096038.61604: done getting variables 7530 1727096038.61650: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096038.61735: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 08:53:58 -0400 (0:00:00.392) 0:00:29.405 ****** 7530 
1727096038.61759: entering _queue_task() for managed_node3/command 7530 1727096038.62018: worker is 1 (out of 1 available) 7530 1727096038.62031: exiting _queue_task() for managed_node3/command 7530 1727096038.62043: done queuing things up, now waiting for results queue to drain 7530 1727096038.62045: waiting for pending results... 7530 1727096038.62229: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7530 1727096038.62309: in run() - task 0afff68d-5257-086b-f4f0-000000000e06 7530 1727096038.62321: variable 'ansible_search_path' from source: unknown 7530 1727096038.62324: variable 'ansible_search_path' from source: unknown 7530 1727096038.62356: calling self._execute() 7530 1727096038.62439: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.62446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.62573: variable 'omit' from source: magic vars 7530 1727096038.62996: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.63018: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.63215: variable 'type' from source: play vars 7530 1727096038.63225: variable 'state' from source: include params 7530 1727096038.63236: variable 'interface' from source: play vars 7530 1727096038.63243: variable 'current_interfaces' from source: set_fact 7530 1727096038.63254: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7530 1727096038.63260: when evaluation is False, skipping this task 7530 1727096038.63266: _execute() done 7530 1727096038.63376: dumping result to json 7530 1727096038.63379: done dumping result, returning 7530 1727096038.63380: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0afff68d-5257-086b-f4f0-000000000e06] 7530 1727096038.63383: sending task result for task 0afff68d-5257-086b-f4f0-000000000e06 7530 
1727096038.63443: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e06 7530 1727096038.63446: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096038.63495: no more pending results, returning what we have 7530 1727096038.63499: results queue empty 7530 1727096038.63500: checking for any_errors_fatal 7530 1727096038.63508: done checking for any_errors_fatal 7530 1727096038.63508: checking for max_fail_percentage 7530 1727096038.63510: done checking for max_fail_percentage 7530 1727096038.63510: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.63511: done checking to see if all hosts have failed 7530 1727096038.63512: getting the remaining hosts for this loop 7530 1727096038.63513: done getting the remaining hosts for this loop 7530 1727096038.63517: getting the next task for host managed_node3 7530 1727096038.63523: done getting next task for host managed_node3 7530 1727096038.63525: ^ task is: TASK: Delete dummy interface {{ interface }} 7530 1727096038.63528: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096038.63532: getting variables 7530 1727096038.63534: in VariableManager get_vars() 7530 1727096038.63578: Calling all_inventory to load vars for managed_node3 7530 1727096038.63580: Calling groups_inventory to load vars for managed_node3 7530 1727096038.63582: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.63593: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.63595: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.63598: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.65297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.66777: done with get_vars() 7530 1727096038.66817: done getting variables 7530 1727096038.66882: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096038.67009: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 08:53:58 -0400 (0:00:00.052) 0:00:29.458 ****** 7530 1727096038.67041: entering _queue_task() for managed_node3/command 7530 1727096038.67403: worker is 1 (out of 1 available) 7530 1727096038.67417: exiting _queue_task() for managed_node3/command 7530 1727096038.67431: done queuing things up, now waiting for results queue to drain 7530 1727096038.67432: waiting for pending results... 
7530 1727096038.67747: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7530 1727096038.67886: in run() - task 0afff68d-5257-086b-f4f0-000000000e07 7530 1727096038.67908: variable 'ansible_search_path' from source: unknown 7530 1727096038.67915: variable 'ansible_search_path' from source: unknown 7530 1727096038.67955: calling self._execute() 7530 1727096038.68095: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.68101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.68104: variable 'omit' from source: magic vars 7530 1727096038.68519: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.68550: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.68860: variable 'type' from source: play vars 7530 1727096038.68863: variable 'state' from source: include params 7530 1727096038.68866: variable 'interface' from source: play vars 7530 1727096038.68870: variable 'current_interfaces' from source: set_fact 7530 1727096038.68872: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7530 1727096038.68874: when evaluation is False, skipping this task 7530 1727096038.68876: _execute() done 7530 1727096038.68878: dumping result to json 7530 1727096038.68880: done dumping result, returning 7530 1727096038.68882: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0afff68d-5257-086b-f4f0-000000000e07] 7530 1727096038.68884: sending task result for task 0afff68d-5257-086b-f4f0-000000000e07 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096038.69017: no more pending results, returning what we have 7530 1727096038.69022: results queue empty 7530 1727096038.69023: checking for 
any_errors_fatal 7530 1727096038.69032: done checking for any_errors_fatal 7530 1727096038.69033: checking for max_fail_percentage 7530 1727096038.69035: done checking for max_fail_percentage 7530 1727096038.69036: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.69037: done checking to see if all hosts have failed 7530 1727096038.69038: getting the remaining hosts for this loop 7530 1727096038.69039: done getting the remaining hosts for this loop 7530 1727096038.69043: getting the next task for host managed_node3 7530 1727096038.69052: done getting next task for host managed_node3 7530 1727096038.69055: ^ task is: TASK: Create tap interface {{ interface }} 7530 1727096038.69059: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096038.69065: getting variables 7530 1727096038.69073: in VariableManager get_vars() 7530 1727096038.69136: Calling all_inventory to load vars for managed_node3 7530 1727096038.69140: Calling groups_inventory to load vars for managed_node3 7530 1727096038.69143: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.69159: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.69163: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.69166: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.69983: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e07 7530 1727096038.69987: WORKER PROCESS EXITING 7530 1727096038.70839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.72450: done with get_vars() 7530 1727096038.72490: done getting variables 7530 1727096038.72562: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096038.72695: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 08:53:58 -0400 (0:00:00.056) 0:00:29.515 ****** 7530 1727096038.72728: entering _queue_task() for managed_node3/command 7530 1727096038.73211: worker is 1 (out of 1 available) 7530 1727096038.73223: exiting _queue_task() for managed_node3/command 7530 1727096038.73235: done queuing things up, now waiting for results queue to drain 7530 1727096038.73237: waiting for pending results... 
7530 1727096038.73458: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7530 1727096038.73585: in run() - task 0afff68d-5257-086b-f4f0-000000000e08 7530 1727096038.73611: variable 'ansible_search_path' from source: unknown 7530 1727096038.73620: variable 'ansible_search_path' from source: unknown 7530 1727096038.73683: calling self._execute() 7530 1727096038.73778: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.73796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.73829: variable 'omit' from source: magic vars 7530 1727096038.74215: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.74266: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.74465: variable 'type' from source: play vars 7530 1727096038.74484: variable 'state' from source: include params 7530 1727096038.74496: variable 'interface' from source: play vars 7530 1727096038.74505: variable 'current_interfaces' from source: set_fact 7530 1727096038.74518: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7530 1727096038.74526: when evaluation is False, skipping this task 7530 1727096038.74533: _execute() done 7530 1727096038.74541: dumping result to json 7530 1727096038.74551: done dumping result, returning 7530 1727096038.74562: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0afff68d-5257-086b-f4f0-000000000e08] 7530 1727096038.74575: sending task result for task 0afff68d-5257-086b-f4f0-000000000e08 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096038.74848: no more pending results, returning what we have 7530 1727096038.74853: results queue empty 7530 1727096038.74854: checking for 
any_errors_fatal 7530 1727096038.74861: done checking for any_errors_fatal 7530 1727096038.74862: checking for max_fail_percentage 7530 1727096038.74864: done checking for max_fail_percentage 7530 1727096038.74865: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.74868: done checking to see if all hosts have failed 7530 1727096038.74870: getting the remaining hosts for this loop 7530 1727096038.74872: done getting the remaining hosts for this loop 7530 1727096038.74876: getting the next task for host managed_node3 7530 1727096038.74884: done getting next task for host managed_node3 7530 1727096038.74887: ^ task is: TASK: Delete tap interface {{ interface }} 7530 1727096038.74892: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
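The skip recorded above maps to a conditional command task at manage_test_interface.yml:60. Only the task name, the file:line, and the when-condition are confirmed by the log; the command body below is an assumption about how such a tap interface is typically created:

```yaml
# Sketch of the skipped task (manage_test_interface.yml:60). Only the name
# and the when-clause are confirmed by the log above; the `ip tuntap`
# command is an assumption.
- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap
  when: type == 'tap' and state == 'present' and interface not in current_interfaces
```

Because the conditional evaluated to False for this run, the worker returns a skip result (`skip_reason: Conditional result was False`) without ever dispatching the command to the managed host.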
False 7530 1727096038.74896: getting variables 7530 1727096038.74899: in VariableManager get_vars() 7530 1727096038.75166: Calling all_inventory to load vars for managed_node3 7530 1727096038.75171: Calling groups_inventory to load vars for managed_node3 7530 1727096038.75174: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.75180: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e08 7530 1727096038.75183: WORKER PROCESS EXITING 7530 1727096038.75192: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.75195: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.75197: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.82257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.84197: done with get_vars() 7530 1727096038.84232: done getting variables 7530 1727096038.84301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096038.84425: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 08:53:58 -0400 (0:00:00.117) 0:00:29.632 ****** 7530 1727096038.84450: entering _queue_task() for managed_node3/command 7530 1727096038.84937: worker is 1 (out of 1 available) 7530 1727096038.84950: exiting _queue_task() for managed_node3/command 7530 1727096038.84962: done queuing things up, now waiting for results queue to drain 7530 1727096038.84965: waiting for pending results... 
7530 1727096038.85195: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7530 1727096038.85339: in run() - task 0afff68d-5257-086b-f4f0-000000000e09 7530 1727096038.85363: variable 'ansible_search_path' from source: unknown 7530 1727096038.85379: variable 'ansible_search_path' from source: unknown 7530 1727096038.85427: calling self._execute() 7530 1727096038.85546: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.85559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.85591: variable 'omit' from source: magic vars 7530 1727096038.86009: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.86076: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.86265: variable 'type' from source: play vars 7530 1727096038.86279: variable 'state' from source: include params 7530 1727096038.86293: variable 'interface' from source: play vars 7530 1727096038.86303: variable 'current_interfaces' from source: set_fact 7530 1727096038.86317: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7530 1727096038.86325: when evaluation is False, skipping this task 7530 1727096038.86353: _execute() done 7530 1727096038.86357: dumping result to json 7530 1727096038.86359: done dumping result, returning 7530 1727096038.86361: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0afff68d-5257-086b-f4f0-000000000e09] 7530 1727096038.86365: sending task result for task 0afff68d-5257-086b-f4f0-000000000e09 7530 1727096038.86541: done sending task result for task 0afff68d-5257-086b-f4f0-000000000e09 7530 1727096038.86544: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096038.86643: no 
more pending results, returning what we have 7530 1727096038.86647: results queue empty 7530 1727096038.86647: checking for any_errors_fatal 7530 1727096038.86655: done checking for any_errors_fatal 7530 1727096038.86656: checking for max_fail_percentage 7530 1727096038.86657: done checking for max_fail_percentage 7530 1727096038.86659: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.86660: done checking to see if all hosts have failed 7530 1727096038.86869: getting the remaining hosts for this loop 7530 1727096038.86872: done getting the remaining hosts for this loop 7530 1727096038.86876: getting the next task for host managed_node3 7530 1727096038.86883: done getting next task for host managed_node3 7530 1727096038.86886: ^ task is: TASK: TEST: I can configure an interface with auto_gateway disabled 7530 1727096038.86888: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096038.86892: getting variables 7530 1727096038.86894: in VariableManager get_vars() 7530 1727096038.86939: Calling all_inventory to load vars for managed_node3 7530 1727096038.86942: Calling groups_inventory to load vars for managed_node3 7530 1727096038.86944: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.86955: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.86958: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.86961: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.89674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096038.92833: done with get_vars() 7530 1727096038.92865: done getting variables 7530 1727096038.92926: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway disabled] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:83 Monday 23 September 2024 08:53:58 -0400 (0:00:00.085) 0:00:29.717 ****** 7530 1727096038.92956: entering _queue_task() for managed_node3/debug 7530 1727096038.93697: worker is 1 (out of 1 available) 7530 1727096038.93709: exiting _queue_task() for managed_node3/debug 7530 1727096038.93721: done queuing things up, now waiting for results queue to drain 7530 1727096038.93723: waiting for pending results... 
7530 1727096038.94395: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled 7530 1727096038.94576: in run() - task 0afff68d-5257-086b-f4f0-0000000000af 7530 1727096038.94591: variable 'ansible_search_path' from source: unknown 7530 1727096038.94628: calling self._execute() 7530 1727096038.94773: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.94777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.94780: variable 'omit' from source: magic vars 7530 1727096038.95531: variable 'ansible_distribution_major_version' from source: facts 7530 1727096038.95551: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096038.95564: variable 'omit' from source: magic vars 7530 1727096038.95604: variable 'omit' from source: magic vars 7530 1727096038.95650: variable 'omit' from source: magic vars 7530 1727096038.95812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096038.95817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096038.95820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096038.95822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096038.95824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096038.95856: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096038.95865: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.95876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.95997: Set connection var ansible_pipelining to False 7530 
1727096038.96009: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096038.96020: Set connection var ansible_timeout to 10 7530 1727096038.96047: Set connection var ansible_shell_executable to /bin/sh 7530 1727096038.96056: Set connection var ansible_shell_type to sh 7530 1727096038.96063: Set connection var ansible_connection to ssh 7530 1727096038.96140: variable 'ansible_shell_executable' from source: unknown 7530 1727096038.96143: variable 'ansible_connection' from source: unknown 7530 1727096038.96146: variable 'ansible_module_compression' from source: unknown 7530 1727096038.96149: variable 'ansible_shell_type' from source: unknown 7530 1727096038.96156: variable 'ansible_shell_executable' from source: unknown 7530 1727096038.96159: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096038.96161: variable 'ansible_pipelining' from source: unknown 7530 1727096038.96163: variable 'ansible_timeout' from source: unknown 7530 1727096038.96166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096038.96314: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096038.96357: variable 'omit' from source: magic vars 7530 1727096038.96361: starting attempt loop 7530 1727096038.96364: running the handler 7530 1727096038.96410: handler run complete 7530 1727096038.96433: attempt loop complete, returning result 7530 1727096038.96466: _execute() done 7530 1727096038.96471: dumping result to json 7530 1727096038.96473: done dumping result, returning 7530 1727096038.96475: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled [0afff68d-5257-086b-f4f0-0000000000af] 7530 
1727096038.96482: sending task result for task 0afff68d-5257-086b-f4f0-0000000000af ok: [managed_node3] => {} MSG: ################################################## 7530 1727096038.96641: no more pending results, returning what we have 7530 1727096038.96645: results queue empty 7530 1727096038.96646: checking for any_errors_fatal 7530 1727096038.96651: done checking for any_errors_fatal 7530 1727096038.96652: checking for max_fail_percentage 7530 1727096038.96654: done checking for max_fail_percentage 7530 1727096038.96655: checking to see if all hosts have failed and the running result is not ok 7530 1727096038.96656: done checking to see if all hosts have failed 7530 1727096038.96657: getting the remaining hosts for this loop 7530 1727096038.96659: done getting the remaining hosts for this loop 7530 1727096038.96663: getting the next task for host managed_node3 7530 1727096038.96672: done getting next task for host managed_node3 7530 1727096038.96677: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7530 1727096038.96679: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
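The `ok` result above comes from a debug banner task at tests_auto_gateway.yml:83. Judging by the MSG line printed in the log, it is roughly the following (a hedged reconstruction, not the verbatim source):

```yaml
# Hedged reconstruction of the banner task -- the msg content is taken
# from the MSG line in the log output above.
- name: "TEST: I can configure an interface with auto_gateway disabled"
  debug:
    msg: "##################################################"
```

`debug` runs entirely on the controller, which is why the handler completes immediately after the connection and shell plugin variables are resolved, with no remote command in between.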
False 7530 1727096038.96683: getting variables 7530 1727096038.96685: in VariableManager get_vars() 7530 1727096038.96743: Calling all_inventory to load vars for managed_node3 7530 1727096038.96746: Calling groups_inventory to load vars for managed_node3 7530 1727096038.96749: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096038.96761: Calling all_plugins_play to load vars for managed_node3 7530 1727096038.96765: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096038.96988: Calling groups_plugins_play to load vars for managed_node3 7530 1727096038.97002: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000af 7530 1727096038.97007: WORKER PROCESS EXITING 7530 1727096038.98632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.00263: done with get_vars() 7530 1727096039.00294: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:87 Monday 23 September 2024 08:53:59 -0400 (0:00:00.074) 0:00:29.792 ****** 7530 1727096039.00401: entering _queue_task() for managed_node3/include_tasks 7530 1727096039.00759: worker is 1 (out of 1 available) 7530 1727096039.00776: exiting _queue_task() for managed_node3/include_tasks 7530 1727096039.00903: done queuing things up, now waiting for results queue to drain 7530 1727096039.00906: waiting for pending results... 
7530 1727096039.01099: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7530 1727096039.01255: in run() - task 0afff68d-5257-086b-f4f0-0000000000b0 7530 1727096039.01281: variable 'ansible_search_path' from source: unknown 7530 1727096039.01317: calling self._execute() 7530 1727096039.01424: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.01436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.01462: variable 'omit' from source: magic vars 7530 1727096039.01971: variable 'ansible_distribution_major_version' from source: facts 7530 1727096039.01977: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096039.01980: _execute() done 7530 1727096039.01982: dumping result to json 7530 1727096039.01986: done dumping result, returning 7530 1727096039.01989: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-086b-f4f0-0000000000b0] 7530 1727096039.01991: sending task result for task 0afff68d-5257-086b-f4f0-0000000000b0 7530 1727096039.02075: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000b0 7530 1727096039.02078: WORKER PROCESS EXITING 7530 1727096039.02109: no more pending results, returning what we have 7530 1727096039.02114: in VariableManager get_vars() 7530 1727096039.02178: Calling all_inventory to load vars for managed_node3 7530 1727096039.02181: Calling groups_inventory to load vars for managed_node3 7530 1727096039.02184: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.02200: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.02204: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.02207: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.03929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096039.07282: done with get_vars() 7530 1727096039.07313: variable 'ansible_search_path' from source: unknown 7530 1727096039.07330: we have included files to process 7530 1727096039.07331: generating all_blocks data 7530 1727096039.07333: done generating all_blocks data 7530 1727096039.07339: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096039.07340: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096039.07342: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096039.08155: in VariableManager get_vars() 7530 1727096039.08188: done with get_vars() 7530 1727096039.09433: done processing included file 7530 1727096039.09435: iterating over new_blocks loaded from include file 7530 1727096039.09437: in VariableManager get_vars() 7530 1727096039.09469: done with get_vars() 7530 1727096039.09673: filtering new block on tags 7530 1727096039.09708: done filtering new block on tags 7530 1727096039.09711: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7530 1727096039.09718: extending task lists for all hosts with included blocks 7530 1727096039.17964: done extending task lists 7530 1727096039.17966: done processing included files 7530 1727096039.17969: results queue empty 7530 1727096039.17970: checking for any_errors_fatal 7530 1727096039.17973: done checking for any_errors_fatal 7530 1727096039.17973: checking for max_fail_percentage 7530 1727096039.17974: done checking for max_fail_percentage 7530 1727096039.17975: checking to see if all hosts have failed and the running 
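The include processed above corresponds to an `include_tasks` at tests_auto_gateway.yml:87. Assuming the tasks file is addressed relative to the playbook (the log confirms the resolved file), it would look like:

```yaml
# Minimal sketch, assuming a playbook-relative path; the log confirms the
# resolved file is tasks/manage_test_interface.yml.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
```

`include_tasks` is why the host state afterwards gains a nested "tasks child state": the included block is spliced into the task list at run time, as the log notes with "extending task lists for all hosts with included blocks".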
result is not ok 7530 1727096039.17976: done checking to see if all hosts have failed 7530 1727096039.17977: getting the remaining hosts for this loop 7530 1727096039.17978: done getting the remaining hosts for this loop 7530 1727096039.17980: getting the next task for host managed_node3 7530 1727096039.17984: done getting next task for host managed_node3 7530 1727096039.17986: ^ task is: TASK: Ensure state in ["present", "absent"] 7530 1727096039.17988: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096039.17990: getting variables 7530 1727096039.17991: in VariableManager get_vars() 7530 1727096039.18012: Calling all_inventory to load vars for managed_node3 7530 1727096039.18014: Calling groups_inventory to load vars for managed_node3 7530 1727096039.18016: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.18022: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.18024: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.18026: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.20605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.24485: done with get_vars() 7530 1727096039.24626: done getting variables 7530 1727096039.24739: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 08:53:59 -0400 (0:00:00.243) 0:00:30.036 ****** 7530 1727096039.24772: entering _queue_task() for managed_node3/fail 7530 1727096039.25227: worker is 1 (out of 1 available) 7530 1727096039.25236: exiting _queue_task() for managed_node3/fail 7530 1727096039.25249: done queuing things up, now waiting for results queue to drain 7530 1727096039.25250: waiting for pending results... 
7530 1727096039.25946: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7530 1727096039.25951: in run() - task 0afff68d-5257-086b-f4f0-0000000010aa 7530 1727096039.25954: variable 'ansible_search_path' from source: unknown 7530 1727096039.25956: variable 'ansible_search_path' from source: unknown 7530 1727096039.25960: calling self._execute() 7530 1727096039.26035: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.26052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.26070: variable 'omit' from source: magic vars 7530 1727096039.26488: variable 'ansible_distribution_major_version' from source: facts 7530 1727096039.26573: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096039.26661: variable 'state' from source: include params 7530 1727096039.26675: Evaluated conditional (state not in ["present", "absent"]): False 7530 1727096039.26682: when evaluation is False, skipping this task 7530 1727096039.26697: _execute() done 7530 1727096039.26704: dumping result to json 7530 1727096039.26711: done dumping result, returning 7530 1727096039.26722: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-086b-f4f0-0000000010aa] 7530 1727096039.26732: sending task result for task 0afff68d-5257-086b-f4f0-0000000010aa skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7530 1727096039.26972: no more pending results, returning what we have 7530 1727096039.26976: results queue empty 7530 1727096039.26977: checking for any_errors_fatal 7530 1727096039.26978: done checking for any_errors_fatal 7530 1727096039.26979: checking for max_fail_percentage 7530 1727096039.26981: done checking for max_fail_percentage 7530 1727096039.26982: checking to see if all hosts have failed and the 
running result is not ok 7530 1727096039.26983: done checking to see if all hosts have failed 7530 1727096039.26984: getting the remaining hosts for this loop 7530 1727096039.26985: done getting the remaining hosts for this loop 7530 1727096039.26989: getting the next task for host managed_node3 7530 1727096039.26996: done getting next task for host managed_node3 7530 1727096039.26999: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096039.27002: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096039.27007: getting variables 7530 1727096039.27009: in VariableManager get_vars() 7530 1727096039.27071: Calling all_inventory to load vars for managed_node3 7530 1727096039.27075: Calling groups_inventory to load vars for managed_node3 7530 1727096039.27078: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.27094: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.27098: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.27101: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.27644: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010aa 7530 1727096039.27648: WORKER PROCESS EXITING 7530 1727096039.29689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.33140: done with get_vars() 7530 1727096039.33177: done getting variables 7530 1727096039.33356: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 08:53:59 -0400 (0:00:00.086) 0:00:30.122 ****** 7530 1727096039.33391: entering _queue_task() for managed_node3/fail 7530 1727096039.34230: worker is 1 (out of 1 available) 7530 1727096039.34243: exiting _queue_task() for managed_node3/fail 7530 1727096039.34256: done queuing things up, now waiting for results queue to drain 7530 1727096039.34258: waiting for pending results... 
7530 1727096039.34701: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096039.35016: in run() - task 0afff68d-5257-086b-f4f0-0000000010ab 7530 1727096039.35099: variable 'ansible_search_path' from source: unknown 7530 1727096039.35104: variable 'ansible_search_path' from source: unknown 7530 1727096039.35203: calling self._execute() 7530 1727096039.35487: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.35493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.35504: variable 'omit' from source: magic vars 7530 1727096039.36293: variable 'ansible_distribution_major_version' from source: facts 7530 1727096039.36303: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096039.36641: variable 'type' from source: play vars 7530 1727096039.36647: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7530 1727096039.36650: when evaluation is False, skipping this task 7530 1727096039.36653: _execute() done 7530 1727096039.36656: dumping result to json 7530 1727096039.36658: done dumping result, returning 7530 1727096039.36726: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-086b-f4f0-0000000010ab] 7530 1727096039.36835: sending task result for task 0afff68d-5257-086b-f4f0-0000000010ab 7530 1727096039.36902: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010ab 7530 1727096039.36905: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7530 1727096039.36989: no more pending results, returning what we have 7530 1727096039.36993: results queue empty 7530 1727096039.36994: checking for any_errors_fatal 7530 1727096039.37005: done checking for any_errors_fatal 7530 1727096039.37006: checking for 
max_fail_percentage 7530 1727096039.37007: done checking for max_fail_percentage 7530 1727096039.37008: checking to see if all hosts have failed and the running result is not ok 7530 1727096039.37010: done checking to see if all hosts have failed 7530 1727096039.37010: getting the remaining hosts for this loop 7530 1727096039.37012: done getting the remaining hosts for this loop 7530 1727096039.37015: getting the next task for host managed_node3 7530 1727096039.37022: done getting next task for host managed_node3 7530 1727096039.37024: ^ task is: TASK: Include the task 'show_interfaces.yml' 7530 1727096039.37027: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
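The two skipped `fail` tasks above are input guards at the top of manage_test_interface.yml (lines 3 and 8). The names and when-conditions are confirmed by the log; the failure messages are assumptions:

```yaml
# Sketch of the two guard tasks; only the names and when-conditions appear
# in the log -- the msg text is hypothetical.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be present or absent"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be dummy, tap, or veth"
  when: type not in ["dummy", "tap", "veth"]
```

Since both conditionals evaluate to False here, each guard is skipped and the runner moves straight on to including show_interfaces.yml.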
False 7530 1727096039.37032: getting variables 7530 1727096039.37033: in VariableManager get_vars() 7530 1727096039.37209: Calling all_inventory to load vars for managed_node3 7530 1727096039.37212: Calling groups_inventory to load vars for managed_node3 7530 1727096039.37215: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.37228: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.37231: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.37234: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.39034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.41057: done with get_vars() 7530 1727096039.41090: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 08:53:59 -0400 (0:00:00.078) 0:00:30.200 ****** 7530 1727096039.41217: entering _queue_task() for managed_node3/include_tasks 7530 1727096039.41793: worker is 1 (out of 1 available) 7530 1727096039.41805: exiting _queue_task() for managed_node3/include_tasks 7530 1727096039.41817: done queuing things up, now waiting for results queue to drain 7530 1727096039.41819: waiting for pending results... 
7530 1727096039.42088: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7530 1727096039.42094: in run() - task 0afff68d-5257-086b-f4f0-0000000010ac 7530 1727096039.42111: variable 'ansible_search_path' from source: unknown 7530 1727096039.42115: variable 'ansible_search_path' from source: unknown 7530 1727096039.42155: calling self._execute() 7530 1727096039.42275: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.42282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.42292: variable 'omit' from source: magic vars 7530 1727096039.42732: variable 'ansible_distribution_major_version' from source: facts 7530 1727096039.42748: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096039.42754: _execute() done 7530 1727096039.42757: dumping result to json 7530 1727096039.42759: done dumping result, returning 7530 1727096039.42769: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-086b-f4f0-0000000010ac] 7530 1727096039.42775: sending task result for task 0afff68d-5257-086b-f4f0-0000000010ac 7530 1727096039.42903: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010ac 7530 1727096039.42905: WORKER PROCESS EXITING 7530 1727096039.42947: no more pending results, returning what we have 7530 1727096039.42953: in VariableManager get_vars() 7530 1727096039.43019: Calling all_inventory to load vars for managed_node3 7530 1727096039.43023: Calling groups_inventory to load vars for managed_node3 7530 1727096039.43025: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.43044: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.43048: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.43052: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.45387: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.47096: done with get_vars() 7530 1727096039.47130: variable 'ansible_search_path' from source: unknown 7530 1727096039.47131: variable 'ansible_search_path' from source: unknown 7530 1727096039.47173: we have included files to process 7530 1727096039.47174: generating all_blocks data 7530 1727096039.47176: done generating all_blocks data 7530 1727096039.47180: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096039.47181: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096039.47184: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096039.47298: in VariableManager get_vars() 7530 1727096039.47333: done with get_vars() 7530 1727096039.47460: done processing included file 7530 1727096039.47462: iterating over new_blocks loaded from include file 7530 1727096039.47463: in VariableManager get_vars() 7530 1727096039.47489: done with get_vars() 7530 1727096039.47490: filtering new block on tags 7530 1727096039.47509: done filtering new block on tags 7530 1727096039.47511: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7530 1727096039.47516: extending task lists for all hosts with included blocks 7530 1727096039.47982: done extending task lists 7530 1727096039.47983: done processing included files 7530 1727096039.47984: results queue empty 7530 1727096039.47985: checking for any_errors_fatal 7530 1727096039.47988: done checking for any_errors_fatal 7530 1727096039.47989: checking for max_fail_percentage 7530 
1727096039.47990: done checking for max_fail_percentage 7530 1727096039.47990: checking to see if all hosts have failed and the running result is not ok 7530 1727096039.47991: done checking to see if all hosts have failed 7530 1727096039.47996: getting the remaining hosts for this loop 7530 1727096039.47997: done getting the remaining hosts for this loop 7530 1727096039.48000: getting the next task for host managed_node3 7530 1727096039.48004: done getting next task for host managed_node3 7530 1727096039.48007: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7530 1727096039.48010: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096039.48013: getting variables 7530 1727096039.48014: in VariableManager get_vars() 7530 1727096039.48037: Calling all_inventory to load vars for managed_node3 7530 1727096039.48040: Calling groups_inventory to load vars for managed_node3 7530 1727096039.48042: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.48048: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.48051: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.48053: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.49514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.51243: done with get_vars() 7530 1727096039.51280: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:53:59 -0400 (0:00:00.101) 0:00:30.302 ****** 7530 1727096039.51374: entering _queue_task() for managed_node3/include_tasks 7530 1727096039.51726: worker is 1 (out of 1 available) 7530 1727096039.51743: exiting _queue_task() for managed_node3/include_tasks 7530 1727096039.51873: done queuing things up, now waiting for results queue to drain 7530 1727096039.51878: waiting for pending results... 
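The include cycle logged above (loading `show_interfaces.yml`, "generating all_blocks data", "filtering new block on tags", "extending task lists") can be sketched roughly as follows; the helper name and dict shape are ours, not Ansible's:

```python
# Hypothetical sketch of the "filtering new block on tags" step: blocks loaded
# from an included file are kept when no tag filter is active, or when their
# tags intersect the requested ones.
def filter_blocks_on_tags(entries, only_tags):
    kept = []
    for entry in entries:                        # "iterating over new_blocks"
        entry_tags = set(entry.get("tags", []))
        # Keep a block when no tag filter is set, or when tags intersect.
        if not only_tags or entry_tags & set(only_tags):
            kept.append(entry)                   # "done filtering new block on tags"
    return kept

blocks = [{"name": "Gather current interface info", "tags": ["setup"]},
          {"name": "Debug current_interfaces", "tags": ["debug"]}]
```

In this run no `--tags` filter was passed, so every block from the included file survives and is appended to the host's task list.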
7530 1727096039.52248: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7530 1727096039.52254: in run() - task 0afff68d-5257-086b-f4f0-00000000130a 7530 1727096039.52258: variable 'ansible_search_path' from source: unknown 7530 1727096039.52261: variable 'ansible_search_path' from source: unknown 7530 1727096039.52264: calling self._execute() 7530 1727096039.52369: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.52375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.52483: variable 'omit' from source: magic vars 7530 1727096039.52789: variable 'ansible_distribution_major_version' from source: facts 7530 1727096039.52800: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096039.52807: _execute() done 7530 1727096039.52810: dumping result to json 7530 1727096039.52815: done dumping result, returning 7530 1727096039.52822: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-086b-f4f0-00000000130a] 7530 1727096039.52830: sending task result for task 0afff68d-5257-086b-f4f0-00000000130a 7530 1727096039.52925: done sending task result for task 0afff68d-5257-086b-f4f0-00000000130a 7530 1727096039.52930: WORKER PROCESS EXITING 7530 1727096039.52992: no more pending results, returning what we have 7530 1727096039.52998: in VariableManager get_vars() 7530 1727096039.53058: Calling all_inventory to load vars for managed_node3 7530 1727096039.53061: Calling groups_inventory to load vars for managed_node3 7530 1727096039.53064: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.53282: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.53286: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.53289: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.55128: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.56901: done with get_vars() 7530 1727096039.56930: variable 'ansible_search_path' from source: unknown 7530 1727096039.56932: variable 'ansible_search_path' from source: unknown 7530 1727096039.57001: we have included files to process 7530 1727096039.57003: generating all_blocks data 7530 1727096039.57006: done generating all_blocks data 7530 1727096039.57007: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096039.57009: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096039.57011: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096039.57295: done processing included file 7530 1727096039.57297: iterating over new_blocks loaded from include file 7530 1727096039.57299: in VariableManager get_vars() 7530 1727096039.57327: done with get_vars() 7530 1727096039.57328: filtering new block on tags 7530 1727096039.57346: done filtering new block on tags 7530 1727096039.57348: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7530 1727096039.57353: extending task lists for all hosts with included blocks 7530 1727096039.57498: done extending task lists 7530 1727096039.57499: done processing included files 7530 1727096039.57503: results queue empty 7530 1727096039.57504: checking for any_errors_fatal 7530 1727096039.57508: done checking for any_errors_fatal 7530 1727096039.57509: checking for max_fail_percentage 7530 1727096039.57510: done checking for max_fail_percentage 7530 
1727096039.57511: checking to see if all hosts have failed and the running result is not ok 7530 1727096039.57511: done checking to see if all hosts have failed 7530 1727096039.57512: getting the remaining hosts for this loop 7530 1727096039.57513: done getting the remaining hosts for this loop 7530 1727096039.57516: getting the next task for host managed_node3 7530 1727096039.57520: done getting next task for host managed_node3 7530 1727096039.57522: ^ task is: TASK: Gather current interface info 7530 1727096039.57525: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096039.57528: getting variables 7530 1727096039.57528: in VariableManager get_vars() 7530 1727096039.57545: Calling all_inventory to load vars for managed_node3 7530 1727096039.57547: Calling groups_inventory to load vars for managed_node3 7530 1727096039.57549: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.57554: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.57557: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.57559: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.58808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096039.60370: done with get_vars() 7530 1727096039.60403: done getting variables 7530 1727096039.60458: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:53:59 -0400 (0:00:00.091) 0:00:30.393 ****** 7530 1727096039.60494: entering _queue_task() for managed_node3/command 7530 1727096039.61073: worker is 1 (out of 1 available) 7530 1727096039.61084: exiting _queue_task() for managed_node3/command 7530 1727096039.61094: done queuing things up, now waiting for results queue to drain 7530 1727096039.61095: waiting for pending results... 
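The nested "HOST STATE: block=..., task=..., tasks child state? (...)" dump above can be modeled as a small recursive record; each included task file adds one "tasks child state?" level. This is a simplified stand-in, not Ansible's actual `HostState` class:

```python
from dataclasses import dataclass
from typing import Optional

# Simplified model of the nested HOST STATE dumps in the log; each
# "tasks child state?" adds one nesting level of the same record.
@dataclass
class HostState:
    block: int
    task: int
    run_state: int = 1
    fail_state: int = 0
    tasks_child: Optional["HostState"] = None

    def depth(self) -> int:
        # Count this level plus every nested "tasks child state?" level.
        return 1 if self.tasks_child is None else 1 + self.tasks_child.depth()

# The dump before "Gather current interface info" shows three nested children:
state = HostState(
    block=2, task=25,
    tasks_child=HostState(
        block=0, task=4,
        tasks_child=HostState(
            block=0, task=2,
            tasks_child=HostState(block=0, task=1))))
```

The growing depth tracks the include chain: play → `manage_test_interface.yml` → `show_interfaces.yml` → `get_current_interfaces.yml`.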
7530 1727096039.61312: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7530 1727096039.61318: in run() - task 0afff68d-5257-086b-f4f0-000000001341 7530 1727096039.61343: variable 'ansible_search_path' from source: unknown 7530 1727096039.61347: variable 'ansible_search_path' from source: unknown 7530 1727096039.61411: calling self._execute() 7530 1727096039.61485: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.61490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.61499: variable 'omit' from source: magic vars 7530 1727096039.61794: variable 'ansible_distribution_major_version' from source: facts 7530 1727096039.61804: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096039.61811: variable 'omit' from source: magic vars 7530 1727096039.61849: variable 'omit' from source: magic vars 7530 1727096039.61878: variable 'omit' from source: magic vars 7530 1727096039.61913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096039.61940: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096039.61959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096039.61975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096039.61984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096039.62008: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096039.62011: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.62014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.62088: Set connection 
var ansible_pipelining to False 7530 1727096039.62094: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096039.62100: Set connection var ansible_timeout to 10 7530 1727096039.62107: Set connection var ansible_shell_executable to /bin/sh 7530 1727096039.62110: Set connection var ansible_shell_type to sh 7530 1727096039.62112: Set connection var ansible_connection to ssh 7530 1727096039.62133: variable 'ansible_shell_executable' from source: unknown 7530 1727096039.62136: variable 'ansible_connection' from source: unknown 7530 1727096039.62139: variable 'ansible_module_compression' from source: unknown 7530 1727096039.62142: variable 'ansible_shell_type' from source: unknown 7530 1727096039.62144: variable 'ansible_shell_executable' from source: unknown 7530 1727096039.62146: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096039.62148: variable 'ansible_pipelining' from source: unknown 7530 1727096039.62150: variable 'ansible_timeout' from source: unknown 7530 1727096039.62153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096039.62259: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096039.62271: variable 'omit' from source: magic vars 7530 1727096039.62277: starting attempt loop 7530 1727096039.62280: running the handler 7530 1727096039.62300: _low_level_execute_command(): starting 7530 1727096039.62303: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096039.62821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096039.62830: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096039.62834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096039.62889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096039.62893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096039.62897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096039.62938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096039.64652: stdout chunk (state=3): >>>/root <<< 7530 1727096039.64750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096039.64782: stderr chunk (state=3): >>><<< 7530 1727096039.64785: stdout chunk (state=3): >>><<< 7530 1727096039.64809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096039.64822: _low_level_execute_command(): starting 7530 1727096039.64827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666 `" && echo ansible-tmp-1727096039.6480951-8660-221451214802666="` echo /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666 `" ) && sleep 0' 7530 1727096039.65290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096039.65294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096039.65299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096039.65309: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096039.65311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096039.65353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096039.65357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096039.65400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096039.67424: stdout chunk (state=3): >>>ansible-tmp-1727096039.6480951-8660-221451214802666=/root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666 <<< 7530 1727096039.67583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096039.67587: stdout chunk (state=3): >>><<< 7530 1727096039.67589: stderr chunk (state=3): >>><<< 7530 1727096039.67607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096039.6480951-8660-221451214802666=/root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096039.67647: variable 'ansible_module_compression' from source: unknown 7530 1727096039.67773: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096039.67776: variable 'ansible_facts' from source: unknown 7530 1727096039.67852: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py 7530 1727096039.68106: Sending initial data 7530 1727096039.68124: Sent initial data (154 bytes) 7530 1727096039.68779: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096039.68796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096039.68810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096039.68886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096039.70531: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7530 1727096039.70552: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7530 1727096039.70565: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7530 1727096039.70585: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096039.70644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096039.70691: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py" <<< 7530 1727096039.70720: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmphqdbh4eh /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py <<< 7530 1727096039.70741: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmphqdbh4eh" to remote "/root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py" <<< 7530 1727096039.71504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096039.71516: stdout chunk (state=3): >>><<< 7530 1727096039.71533: stderr chunk (state=3): >>><<< 7530 1727096039.71576: done transferring module to remote 7530 1727096039.71592: _low_level_execute_command(): starting 7530 1727096039.71611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/ /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py && sleep 0' 7530 1727096039.72393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096039.72450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096039.72485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096039.74441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096039.74474: stdout chunk (state=3): >>><<< 7530 1727096039.74492: stderr chunk (state=3): >>><<< 7530 1727096039.74593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096039.74599: _low_level_execute_command(): starting 7530 1727096039.74602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/AnsiballZ_command.py && sleep 0' 7530 1727096039.75281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096039.75321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096039.75336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096039.75363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096039.75439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096039.91927: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:59.913487", 
"end": "2024-09-23 08:53:59.916930", "delta": "0:00:00.003443", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096039.93733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096039.93737: stdout chunk (state=3): >>><<< 7530 1727096039.93740: stderr chunk (state=3): >>><<< 7530 1727096039.93975: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:53:59.913487", "end": "2024-09-23 08:53:59.916930", "delta": "0:00:00.003443", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096039.93981: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096039.93985: _low_level_execute_command(): starting 7530 1727096039.93987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096039.6480951-8660-221451214802666/ > /dev/null 2>&1 && sleep 0' 7530 1727096039.94460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096039.94471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096039.94482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096039.94496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096039.94509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 
7530 1727096039.94515: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096039.94533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096039.94547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096039.94555: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096039.94562: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096039.94576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096039.94590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096039.94645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096039.94684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096039.94699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096039.94717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096039.94790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096039.96775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096039.96779: stdout chunk (state=3): >>><<< 7530 1727096039.96974: stderr chunk (state=3): >>><<< 7530 1727096039.96978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096039.96980: handler run complete 7530 1727096039.96981: Evaluated conditional (False): False 7530 1727096039.96983: attempt loop complete, returning result 7530 1727096039.96984: _execute() done 7530 1727096039.96986: dumping result to json 7530 1727096039.96987: done dumping result, returning 7530 1727096039.96989: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-086b-f4f0-000000001341] 7530 1727096039.96990: sending task result for task 0afff68d-5257-086b-f4f0-000000001341 7530 1727096039.97057: done sending task result for task 0afff68d-5257-086b-f4f0-000000001341 7530 1727096039.97060: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003443", "end": "2024-09-23 08:53:59.916930", "rc": 0, "start": "2024-09-23 08:53:59.913487" } STDOUT: eth0 lo 7530 1727096039.97152: no more pending results, returning what we have 7530 1727096039.97156: results queue empty 7530 1727096039.97157: checking for any_errors_fatal 7530 1727096039.97159: 
done checking for any_errors_fatal 7530 1727096039.97160: checking for max_fail_percentage 7530 1727096039.97161: done checking for max_fail_percentage 7530 1727096039.97162: checking to see if all hosts have failed and the running result is not ok 7530 1727096039.97163: done checking to see if all hosts have failed 7530 1727096039.97164: getting the remaining hosts for this loop 7530 1727096039.97166: done getting the remaining hosts for this loop 7530 1727096039.97172: getting the next task for host managed_node3 7530 1727096039.97182: done getting next task for host managed_node3 7530 1727096039.97185: ^ task is: TASK: Set current_interfaces 7530 1727096039.97191: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096039.97198: getting variables 7530 1727096039.97200: in VariableManager get_vars() 7530 1727096039.97253: Calling all_inventory to load vars for managed_node3 7530 1727096039.97257: Calling groups_inventory to load vars for managed_node3 7530 1727096039.97260: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096039.97390: Calling all_plugins_play to load vars for managed_node3 7530 1727096039.97395: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096039.97398: Calling groups_plugins_play to load vars for managed_node3 7530 1727096039.98916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096040.01682: done with get_vars() 7530 1727096040.01720: done getting variables 7530 1727096040.01798: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:54:00 -0400 (0:00:00.413) 0:00:30.806 ****** 7530 1727096040.01843: entering _queue_task() for managed_node3/set_fact 7530 1727096040.02307: worker is 1 (out of 1 available) 7530 1727096040.02319: exiting _queue_task() for managed_node3/set_fact 7530 1727096040.02330: done queuing things up, now waiting for results queue to drain 7530 1727096040.02332: waiting for pending results... 
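The "Gather current interface info" task that just completed simply ran `ls -1` with `chdir: /sys/class/net` on the managed node, since each entry in that sysfs directory is one network interface. A minimal local sketch of the same idea (the helper name is illustrative, not part of Ansible):

```python
import os

def list_interfaces(sysfs_path="/sys/class/net"):
    """Return network interface names, like `ls -1` run in /sys/class/net.

    Each directory entry under /sys/class/net corresponds to one
    network interface (e.g. eth0, lo).
    """
    return sorted(os.listdir(sysfs_path))
```

On the managed node in this log the equivalent command produced `eth0` and `lo`.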
7530 1727096040.02548: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7530 1727096040.02673: in run() - task 0afff68d-5257-086b-f4f0-000000001342 7530 1727096040.02688: variable 'ansible_search_path' from source: unknown 7530 1727096040.02692: variable 'ansible_search_path' from source: unknown 7530 1727096040.02734: calling self._execute() 7530 1727096040.02835: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.02840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096040.02851: variable 'omit' from source: magic vars 7530 1727096040.03247: variable 'ansible_distribution_major_version' from source: facts 7530 1727096040.03264: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096040.03473: variable 'omit' from source: magic vars 7530 1727096040.03478: variable 'omit' from source: magic vars 7530 1727096040.03482: variable '_current_interfaces' from source: set_fact 7530 1727096040.03536: variable 'omit' from source: magic vars 7530 1727096040.03587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096040.03626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096040.03645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096040.03662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096040.03674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096040.03725: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096040.03731: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.03734: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7530 1727096040.03973: Set connection var ansible_pipelining to False 7530 1727096040.03976: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096040.03980: Set connection var ansible_timeout to 10 7530 1727096040.03982: Set connection var ansible_shell_executable to /bin/sh 7530 1727096040.03986: Set connection var ansible_shell_type to sh 7530 1727096040.03988: Set connection var ansible_connection to ssh 7530 1727096040.04374: variable 'ansible_shell_executable' from source: unknown 7530 1727096040.04378: variable 'ansible_connection' from source: unknown 7530 1727096040.04381: variable 'ansible_module_compression' from source: unknown 7530 1727096040.04383: variable 'ansible_shell_type' from source: unknown 7530 1727096040.04385: variable 'ansible_shell_executable' from source: unknown 7530 1727096040.04387: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.04389: variable 'ansible_pipelining' from source: unknown 7530 1727096040.04391: variable 'ansible_timeout' from source: unknown 7530 1727096040.04393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096040.04396: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096040.04409: variable 'omit' from source: magic vars 7530 1727096040.04415: starting attempt loop 7530 1727096040.04417: running the handler 7530 1727096040.04433: handler run complete 7530 1727096040.04442: attempt loop complete, returning result 7530 1727096040.04445: _execute() done 7530 1727096040.04447: dumping result to json 7530 1727096040.04450: done dumping result, returning 7530 1727096040.04459: done running TaskExecutor() for managed_node3/TASK: 
Set current_interfaces [0afff68d-5257-086b-f4f0-000000001342] 7530 1727096040.04463: sending task result for task 0afff68d-5257-086b-f4f0-000000001342 7530 1727096040.04842: done sending task result for task 0afff68d-5257-086b-f4f0-000000001342 7530 1727096040.04846: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7530 1727096040.04909: no more pending results, returning what we have 7530 1727096040.04913: results queue empty 7530 1727096040.04914: checking for any_errors_fatal 7530 1727096040.04924: done checking for any_errors_fatal 7530 1727096040.04925: checking for max_fail_percentage 7530 1727096040.04926: done checking for max_fail_percentage 7530 1727096040.04927: checking to see if all hosts have failed and the running result is not ok 7530 1727096040.04928: done checking to see if all hosts have failed 7530 1727096040.04929: getting the remaining hosts for this loop 7530 1727096040.04930: done getting the remaining hosts for this loop 7530 1727096040.04935: getting the next task for host managed_node3 7530 1727096040.04945: done getting next task for host managed_node3 7530 1727096040.04948: ^ task is: TASK: Show current_interfaces 7530 1727096040.04952: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096040.04956: getting variables 7530 1727096040.04958: in VariableManager get_vars() 7530 1727096040.05015: Calling all_inventory to load vars for managed_node3 7530 1727096040.05018: Calling groups_inventory to load vars for managed_node3 7530 1727096040.05020: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096040.05034: Calling all_plugins_play to load vars for managed_node3 7530 1727096040.05038: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096040.05041: Calling groups_plugins_play to load vars for managed_node3 7530 1727096040.08044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096040.11248: done with get_vars() 7530 1727096040.11286: done getting variables 7530 1727096040.11349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:54:00 -0400 (0:00:00.097) 0:00:30.904 ****** 7530 1727096040.11586: entering _queue_task() for managed_node3/debug 7530 1727096040.12157: worker is 1 (out of 1 available) 7530 1727096040.12572: exiting _queue_task() for managed_node3/debug 7530 1727096040.12583: done queuing things up, now waiting for results queue to drain 7530 1727096040.12585: waiting for pending results... 
7530 1727096040.12788: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7530 1727096040.13012: in run() - task 0afff68d-5257-086b-f4f0-00000000130b 7530 1727096040.13143: variable 'ansible_search_path' from source: unknown 7530 1727096040.13147: variable 'ansible_search_path' from source: unknown 7530 1727096040.13186: calling self._execute() 7530 1727096040.13317: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.13324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096040.13374: variable 'omit' from source: magic vars 7530 1727096040.14253: variable 'ansible_distribution_major_version' from source: facts 7530 1727096040.14264: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096040.14273: variable 'omit' from source: magic vars 7530 1727096040.14322: variable 'omit' from source: magic vars 7530 1727096040.14545: variable 'current_interfaces' from source: set_fact 7530 1727096040.14772: variable 'omit' from source: magic vars 7530 1727096040.14775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096040.14779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096040.14900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096040.14917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096040.14931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096040.14959: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096040.14962: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.14965: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7530 1727096040.15085: Set connection var ansible_pipelining to False 7530 1727096040.15088: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096040.15210: Set connection var ansible_timeout to 10 7530 1727096040.15222: Set connection var ansible_shell_executable to /bin/sh 7530 1727096040.15225: Set connection var ansible_shell_type to sh 7530 1727096040.15227: Set connection var ansible_connection to ssh 7530 1727096040.15256: variable 'ansible_shell_executable' from source: unknown 7530 1727096040.15260: variable 'ansible_connection' from source: unknown 7530 1727096040.15262: variable 'ansible_module_compression' from source: unknown 7530 1727096040.15265: variable 'ansible_shell_type' from source: unknown 7530 1727096040.15269: variable 'ansible_shell_executable' from source: unknown 7530 1727096040.15271: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.15572: variable 'ansible_pipelining' from source: unknown 7530 1727096040.15575: variable 'ansible_timeout' from source: unknown 7530 1727096040.15577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096040.15695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096040.15706: variable 'omit' from source: magic vars 7530 1727096040.15711: starting attempt loop 7530 1727096040.15714: running the handler 7530 1727096040.15762: handler run complete 7530 1727096040.15891: attempt loop complete, returning result 7530 1727096040.15894: _execute() done 7530 1727096040.15897: dumping result to json 7530 1727096040.15899: done dumping result, returning 7530 1727096040.15908: done running TaskExecutor() for managed_node3/TASK: Show 
current_interfaces [0afff68d-5257-086b-f4f0-00000000130b] 7530 1727096040.15913: sending task result for task 0afff68d-5257-086b-f4f0-00000000130b 7530 1727096040.16006: done sending task result for task 0afff68d-5257-086b-f4f0-00000000130b 7530 1727096040.16010: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7530 1727096040.16064: no more pending results, returning what we have 7530 1727096040.16070: results queue empty 7530 1727096040.16071: checking for any_errors_fatal 7530 1727096040.16077: done checking for any_errors_fatal 7530 1727096040.16078: checking for max_fail_percentage 7530 1727096040.16080: done checking for max_fail_percentage 7530 1727096040.16081: checking to see if all hosts have failed and the running result is not ok 7530 1727096040.16082: done checking to see if all hosts have failed 7530 1727096040.16083: getting the remaining hosts for this loop 7530 1727096040.16085: done getting the remaining hosts for this loop 7530 1727096040.16089: getting the next task for host managed_node3 7530 1727096040.16098: done getting next task for host managed_node3 7530 1727096040.16101: ^ task is: TASK: Install iproute 7530 1727096040.16104: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096040.16109: getting variables 7530 1727096040.16112: in VariableManager get_vars() 7530 1727096040.16370: Calling all_inventory to load vars for managed_node3 7530 1727096040.16374: Calling groups_inventory to load vars for managed_node3 7530 1727096040.16377: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096040.16388: Calling all_plugins_play to load vars for managed_node3 7530 1727096040.16392: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096040.16394: Calling groups_plugins_play to load vars for managed_node3 7530 1727096040.19247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096040.20910: done with get_vars() 7530 1727096040.20944: done getting variables 7530 1727096040.21014: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 08:54:00 -0400 (0:00:00.094) 0:00:30.998 ****** 7530 1727096040.21045: entering _queue_task() for managed_node3/package 7530 1727096040.21671: worker is 1 (out of 1 available) 7530 1727096040.21680: exiting _queue_task() for managed_node3/package 7530 1727096040.21690: done queuing things up, now waiting for results queue to drain 7530 1727096040.21692: waiting for pending results... 
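Every remote step in this log goes through `_low_level_execute_command()`, which wraps the payload as `/bin/sh -c '<command> && sleep 0'` before sending it over the multiplexed SSH connection. A simplified sketch of that wrapping, assuming a hypothetical helper (this is not Ansible's actual implementation):

```python
def wrap_command(cmd, shell="/bin/sh"):
    """Build an argv in the style seen throughout the log: run cmd under
    sh -c, with a trailing 'sleep 0' appended exactly as the transcript
    shows (e.g. "/bin/sh -c 'echo ~ && sleep 0'")."""
    return [shell, "-c", f"{cmd} && sleep 0"]
```

Note the `&& sleep 0` suffix on every command in the transcript, from the module invocation to the `rm -f -r` temp-dir cleanup.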
7530 1727096040.21825: running TaskExecutor() for managed_node3/TASK: Install iproute 7530 1727096040.21876: in run() - task 0afff68d-5257-086b-f4f0-0000000010ad 7530 1727096040.21897: variable 'ansible_search_path' from source: unknown 7530 1727096040.21905: variable 'ansible_search_path' from source: unknown 7530 1727096040.21953: calling self._execute() 7530 1727096040.22062: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.22077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096040.22137: variable 'omit' from source: magic vars 7530 1727096040.22491: variable 'ansible_distribution_major_version' from source: facts 7530 1727096040.22510: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096040.22522: variable 'omit' from source: magic vars 7530 1727096040.22574: variable 'omit' from source: magic vars 7530 1727096040.22777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096040.25071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096040.25079: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096040.25082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096040.25200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096040.25204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096040.25253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096040.25305: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096040.25338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096040.25381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096040.25396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096040.25514: variable '__network_is_ostree' from source: set_fact 7530 1727096040.25532: variable 'omit' from source: magic vars 7530 1727096040.25571: variable 'omit' from source: magic vars 7530 1727096040.25606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096040.25645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096040.25745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096040.25748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096040.25750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096040.25752: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096040.25753: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.25755: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 7530 1727096040.25856: Set connection var ansible_pipelining to False 7530 1727096040.25871: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096040.25883: Set connection var ansible_timeout to 10 7530 1727096040.25897: Set connection var ansible_shell_executable to /bin/sh 7530 1727096040.25905: Set connection var ansible_shell_type to sh 7530 1727096040.25911: Set connection var ansible_connection to ssh 7530 1727096040.25940: variable 'ansible_shell_executable' from source: unknown 7530 1727096040.25948: variable 'ansible_connection' from source: unknown 7530 1727096040.25961: variable 'ansible_module_compression' from source: unknown 7530 1727096040.25972: variable 'ansible_shell_type' from source: unknown 7530 1727096040.26385: variable 'ansible_shell_executable' from source: unknown 7530 1727096040.26389: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096040.26391: variable 'ansible_pipelining' from source: unknown 7530 1727096040.26394: variable 'ansible_timeout' from source: unknown 7530 1727096040.26396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096040.26494: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096040.26512: variable 'omit' from source: magic vars 7530 1727096040.26523: starting attempt loop 7530 1727096040.26530: running the handler 7530 1727096040.26542: variable 'ansible_facts' from source: unknown 7530 1727096040.26549: variable 'ansible_facts' from source: unknown 7530 1727096040.26590: _low_level_execute_command(): starting 7530 1727096040.26711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096040.28133: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096040.28211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096040.28257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096040.28414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096040.28542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096040.30230: stdout chunk (state=3): >>>/root <<< 7530 1727096040.30610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096040.30614: stdout chunk (state=3): >>><<< 7530 1727096040.30617: stderr chunk (state=3): >>><<< 7530 1727096040.30774: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096040.30789: _low_level_execute_command(): starting 7530 1727096040.30792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414 `" && echo ansible-tmp-1727096040.3065264-8681-17049422316414="` echo /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414 `" ) && sleep 0' 7530 1727096040.31946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096040.31982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096040.32008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096040.32076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096040.32275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096040.32292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096040.32361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096040.34375: stdout chunk (state=3): >>>ansible-tmp-1727096040.3065264-8681-17049422316414=/root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414 <<< 7530 1727096040.34486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096040.34557: stderr chunk (state=3): >>><<< 7530 1727096040.34561: stdout chunk (state=3): >>><<< 7530 1727096040.34782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096040.3065264-8681-17049422316414=/root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096040.34817: variable 'ansible_module_compression' from source: unknown 7530 1727096040.34891: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7530 1727096040.34947: variable 'ansible_facts' from source: unknown 7530 1727096040.35158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py 7530 1727096040.35525: Sending initial data 7530 1727096040.35529: Sent initial data (149 bytes) 7530 1727096040.36109: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096040.36159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096040.36232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096040.36255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096040.36288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096040.36357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096040.38077: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096040.38259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py" <<< 7530 1727096040.38267: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmproolpy0g /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py <<< 7530 1727096040.38510: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmproolpy0g" to remote "/root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py" <<< 7530 1727096040.39878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096040.39971: stderr chunk (state=3): >>><<< 7530 1727096040.40050: stdout chunk (state=3): >>><<< 7530 1727096040.40053: done transferring module to remote 7530 1727096040.40087: _low_level_execute_command(): starting 7530 1727096040.40097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/ /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py && sleep 0' 7530 1727096040.40792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096040.40807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096040.40940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096040.40962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096040.40982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096040.41059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096040.43152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096040.43157: stdout chunk (state=3): >>><<< 7530 1727096040.43159: stderr chunk (state=3): >>><<< 7530 1727096040.43249: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096040.43252: _low_level_execute_command(): starting 7530 1727096040.43255: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/AnsiballZ_dnf.py && sleep 0' 7530 1727096040.44216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096040.44223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096040.44234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096040.44248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096040.44269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096040.44277: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096040.44288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096040.44302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096040.44385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096040.44396: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096040.44436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096040.44491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096040.88970: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7530 1727096040.94305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096040.94334: stderr chunk (state=3): >>><<< 7530 1727096040.94338: stdout chunk (state=3): >>><<< 7530 1727096040.94352: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096040.94390: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096040.94398: _low_level_execute_command(): starting 7530 1727096040.94401: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096040.3065264-8681-17049422316414/ > /dev/null 2>&1 && sleep 0' 7530 1727096040.94858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096040.94863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096040.94889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 
1727096040.94892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096040.94942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096040.94946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096040.94955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096040.95000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096040.96922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096040.96954: stderr chunk (state=3): >>><<< 7530 1727096040.96957: stdout chunk (state=3): >>><<< 7530 1727096040.96974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096040.96981: handler run complete 7530 1727096040.97105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096040.97235: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096040.97302: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096040.97307: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096040.97476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096040.97479: variable '__install_status' from source: set_fact 7530 1727096040.97481: Evaluated conditional (__install_status is success): True 7530 1727096040.97483: attempt loop complete, returning result 7530 1727096040.97485: _execute() done 7530 1727096040.97487: dumping result to json 7530 1727096040.97489: done dumping result, returning 7530 1727096040.97491: done running TaskExecutor() for managed_node3/TASK: Install iproute [0afff68d-5257-086b-f4f0-0000000010ad] 7530 1727096040.97492: sending task result for task 0afff68d-5257-086b-f4f0-0000000010ad 7530 1727096040.97596: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010ad 7530 1727096040.97599: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7530 1727096040.97760: no more pending results, returning what we have 7530 1727096040.97764: results queue empty 7530 1727096040.97765: checking for any_errors_fatal 7530 1727096040.97772: done checking for any_errors_fatal 7530 1727096040.97773: checking for 
max_fail_percentage 7530 1727096040.97774: done checking for max_fail_percentage 7530 1727096040.97775: checking to see if all hosts have failed and the running result is not ok 7530 1727096040.97776: done checking to see if all hosts have failed 7530 1727096040.97777: getting the remaining hosts for this loop 7530 1727096040.97778: done getting the remaining hosts for this loop 7530 1727096040.97782: getting the next task for host managed_node3 7530 1727096040.97787: done getting next task for host managed_node3 7530 1727096040.97789: ^ task is: TASK: Create veth interface {{ interface }} 7530 1727096040.97792: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096040.97797: getting variables 7530 1727096040.97798: in VariableManager get_vars() 7530 1727096040.97843: Calling all_inventory to load vars for managed_node3 7530 1727096040.97845: Calling groups_inventory to load vars for managed_node3 7530 1727096040.97848: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096040.97857: Calling all_plugins_play to load vars for managed_node3 7530 1727096040.97860: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096040.97862: Calling groups_plugins_play to load vars for managed_node3 7530 1727096040.99774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096041.00713: done with get_vars() 7530 1727096041.00747: done getting variables 7530 1727096041.00813: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096041.00939: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 08:54:01 -0400 (0:00:00.799) 0:00:31.798 ****** 7530 1727096041.00982: entering _queue_task() for managed_node3/command 7530 1727096041.01354: worker is 1 (out of 1 available) 7530 1727096041.01372: exiting _queue_task() for managed_node3/command 7530 1727096041.01384: done queuing things up, now waiting for results queue to drain 7530 1727096041.01386: waiting for pending results... 
7530 1727096041.02089: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7530 1727096041.02095: in run() - task 0afff68d-5257-086b-f4f0-0000000010ae 7530 1727096041.02105: variable 'ansible_search_path' from source: unknown 7530 1727096041.02113: variable 'ansible_search_path' from source: unknown 7530 1727096041.02402: variable 'interface' from source: play vars 7530 1727096041.02493: variable 'interface' from source: play vars 7530 1727096041.02571: variable 'interface' from source: play vars 7530 1727096041.02726: Loaded config def from plugin (lookup/items) 7530 1727096041.02742: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7530 1727096041.02773: variable 'omit' from source: magic vars 7530 1727096041.02918: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.02935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.02950: variable 'omit' from source: magic vars 7530 1727096041.03217: variable 'ansible_distribution_major_version' from source: facts 7530 1727096041.03234: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096041.03441: variable 'type' from source: play vars 7530 1727096041.03451: variable 'state' from source: include params 7530 1727096041.03462: variable 'interface' from source: play vars 7530 1727096041.03673: variable 'current_interfaces' from source: set_fact 7530 1727096041.03677: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7530 1727096041.03680: variable 'omit' from source: magic vars 7530 1727096041.03683: variable 'omit' from source: magic vars 7530 1727096041.03685: variable 'item' from source: unknown 7530 1727096041.03687: variable 'item' from source: unknown 7530 1727096041.03690: variable 'omit' from source: magic vars 7530 1727096041.03709: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096041.03747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096041.03772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096041.03793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096041.03809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096041.03847: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096041.03857: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.03865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.04273: Set connection var ansible_pipelining to False 7530 1727096041.04277: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096041.04279: Set connection var ansible_timeout to 10 7530 1727096041.04281: Set connection var ansible_shell_executable to /bin/sh 7530 1727096041.04283: Set connection var ansible_shell_type to sh 7530 1727096041.04285: Set connection var ansible_connection to ssh 7530 1727096041.04287: variable 'ansible_shell_executable' from source: unknown 7530 1727096041.04289: variable 'ansible_connection' from source: unknown 7530 1727096041.04291: variable 'ansible_module_compression' from source: unknown 7530 1727096041.04293: variable 'ansible_shell_type' from source: unknown 7530 1727096041.04295: variable 'ansible_shell_executable' from source: unknown 7530 1727096041.04297: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.04298: variable 'ansible_pipelining' from source: unknown 7530 1727096041.04300: variable 'ansible_timeout' from source: unknown 7530 
1727096041.04302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.04571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096041.04589: variable 'omit' from source: magic vars 7530 1727096041.04598: starting attempt loop 7530 1727096041.04605: running the handler 7530 1727096041.04625: _low_level_execute_command(): starting 7530 1727096041.04640: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096041.05642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.05690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.05708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096041.05799: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.05831: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.05897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.07755: stdout chunk (state=3): >>>/root <<< 7530 1727096041.07827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.07844: stdout chunk (state=3): >>><<< 7530 1727096041.07858: stderr chunk (state=3): >>><<< 7530 1727096041.08196: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.08201: _low_level_execute_command(): starting 7530 1727096041.08206: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324 `" && echo 
ansible-tmp-1727096041.080981-8718-29070220883324="` echo /root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324 `" ) && sleep 0' 7530 1727096041.08971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.08989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.09007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.09031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096041.09051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096041.09062: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096041.09078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.09173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.09187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.09255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.11320: stdout chunk (state=3): >>>ansible-tmp-1727096041.080981-8718-29070220883324=/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324 <<< 7530 
1727096041.11416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.11420: stdout chunk (state=3): >>><<< 7530 1727096041.11426: stderr chunk (state=3): >>><<< 7530 1727096041.11445: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096041.080981-8718-29070220883324=/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.11477: variable 'ansible_module_compression' from source: unknown 7530 1727096041.11521: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096041.11550: variable 'ansible_facts' from source: unknown 7530 1727096041.11608: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py 7530 1727096041.11717: Sending initial data 7530 1727096041.11721: Sent initial data (152 bytes) 7530 1727096041.12165: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.12210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096041.12213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.12215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096041.12218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.12273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.12277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.12280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.12328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.13995: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7530 1727096041.14001: stderr chunk (state=3): 
>>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096041.14025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096041.14056: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpfdxr086z /root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py <<< 7530 1727096041.14071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py" <<< 7530 1727096041.14089: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpfdxr086z" to remote "/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py" <<< 7530 1727096041.14570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.14615: stderr chunk (state=3): >>><<< 7530 1727096041.14619: stdout chunk (state=3): >>><<< 7530 1727096041.14651: done transferring module to remote 7530 1727096041.14660: _low_level_execute_command(): starting 7530 1727096041.14665: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/ /root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py && sleep 0' 7530 1727096041.15118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.15126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096041.15131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096041.15134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.15136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.15174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.15187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.15230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.17061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.17095: stderr chunk (state=3): >>><<< 7530 1727096041.17098: stdout chunk (state=3): >>><<< 7530 1727096041.17109: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.17130: _low_level_execute_command(): starting 7530 1727096041.17134: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/AnsiballZ_command.py && sleep 0' 7530 1727096041.17552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.17556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096041.17588: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.17591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.17593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.17644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.17647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.17661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.17704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.34618: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-23 08:54:01.336975", "end": "2024-09-23 08:54:01.342390", "delta": "0:00:00.005415", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096041.38019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096041.38076: stderr chunk (state=3): >>><<< 7530 1727096041.38080: stdout chunk (state=3): >>><<< 7530 1727096041.38216: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-23 08:54:01.336975", "end": "2024-09-23 08:54:01.342390", "delta": "0:00:00.005415", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
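When digging through dumps like the one above, note that the module result embedded in the stdout is plain JSON and can be pulled apart programmatically rather than read by eye. A minimal sketch — the `raw` string below is copied from the log record above (whitespace reflowed only), and the field names are exactly what the `command` module emitted here:

```python
import json

# Module result copied from the log record above (whitespace reflowed only).
raw = """{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"],
 "start": "2024-09-23 08:54:01.336975", "end": "2024-09-23 08:54:01.342390",
 "delta": "0:00:00.005415", "msg": ""}"""

result = json.loads(raw)

# rc/changed are what Ansible uses to decide whether the task
# succeeded and whether it mutated state on the target.
assert result["rc"] == 0 and result["changed"] is True

# Reconstruct the argv the command module actually ran on the target.
print(" ".join(result["cmd"]))  # → ip link add veth0 type veth peer name peerveth0
```

This is just a reading aid for the captured output, not part of Ansible's own tooling.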
7530 1727096041.38226: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096041.38238: _low_level_execute_command(): starting 7530 1727096041.38241: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096041.080981-8718-29070220883324/ > /dev/null 2>&1 && sleep 0' 7530 1727096041.38782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.38805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.38819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.38842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096041.38858: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096041.38888: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096041.38916: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096041.38981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.39010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.39037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.39050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.39218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.42011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.42038: stderr chunk (state=3): >>><<< 7530 1727096041.42041: stdout chunk (state=3): >>><<< 7530 1727096041.42064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.42073: handler run complete 7530 1727096041.42088: Evaluated conditional (False): False 7530 1727096041.42097: attempt loop complete, returning result 7530 1727096041.42113: variable 'item' from source: unknown 7530 1727096041.42184: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.005415", "end": "2024-09-23 08:54:01.342390", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-23 08:54:01.336975" } 7530 1727096041.42366: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.42381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.42384: variable 'omit' from source: magic vars 7530 1727096041.42444: variable 'ansible_distribution_major_version' from source: facts 7530 1727096041.42448: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096041.42574: variable 'type' from source: play vars 7530 1727096041.42577: variable 'state' from source: include params 7530 1727096041.42580: variable 'interface' from source: play vars 7530 1727096041.42585: variable 'current_interfaces' from source: set_fact 7530 1727096041.42596: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7530 1727096041.42599: variable 'omit' from source: magic vars 7530 1727096041.42610: variable 'omit' from source: magic vars 7530 1727096041.42640: variable 'item' from 
source: unknown 7530 1727096041.42685: variable 'item' from source: unknown 7530 1727096041.42703: variable 'omit' from source: magic vars 7530 1727096041.42717: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096041.42725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096041.42731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096041.42745: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096041.42748: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.42750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.42801: Set connection var ansible_pipelining to False 7530 1727096041.42811: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096041.42815: Set connection var ansible_timeout to 10 7530 1727096041.42818: Set connection var ansible_shell_executable to /bin/sh 7530 1727096041.42820: Set connection var ansible_shell_type to sh 7530 1727096041.42823: Set connection var ansible_connection to ssh 7530 1727096041.42844: variable 'ansible_shell_executable' from source: unknown 7530 1727096041.42847: variable 'ansible_connection' from source: unknown 7530 1727096041.42850: variable 'ansible_module_compression' from source: unknown 7530 1727096041.42852: variable 'ansible_shell_type' from source: unknown 7530 1727096041.42854: variable 'ansible_shell_executable' from source: unknown 7530 1727096041.42856: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.42858: variable 'ansible_pipelining' from source: unknown 7530 1727096041.42861: variable 'ansible_timeout' from source: unknown 7530 1727096041.42865: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.42937: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096041.42944: variable 'omit' from source: magic vars 7530 1727096041.42948: starting attempt loop 7530 1727096041.42950: running the handler 7530 1727096041.42958: _low_level_execute_command(): starting 7530 1727096041.42961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096041.43434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.43442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.43445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.43447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.43501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.43509: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.43511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.43548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.45240: stdout chunk (state=3): >>>/root <<< 7530 1727096041.45403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.45407: stdout chunk (state=3): >>><<< 7530 1727096041.45410: stderr chunk (state=3): >>><<< 7530 1727096041.45426: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.45520: _low_level_execute_command(): starting 7530 1727096041.45524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192 `" && echo ansible-tmp-1727096041.4543426-8718-233197633214192="` echo /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192 `" ) && sleep 0' 7530 1727096041.46101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.46118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.46137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.46155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096041.46175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096041.46186: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096041.46198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.46224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096041.46239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096041.46280: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.46345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.46361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.46385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 7530 1727096041.46452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.48487: stdout chunk (state=3): >>>ansible-tmp-1727096041.4543426-8718-233197633214192=/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192 <<< 7530 1727096041.48655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.48659: stdout chunk (state=3): >>><<< 7530 1727096041.48661: stderr chunk (state=3): >>><<< 7530 1727096041.48874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096041.4543426-8718-233197633214192=/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.48877: variable 'ansible_module_compression' from source: unknown 7530 1727096041.48879: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096041.48881: variable 'ansible_facts' from source: unknown 7530 1727096041.48883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py 7530 1727096041.49019: Sending initial data 7530 1727096041.49027: Sent initial data (154 bytes) 7530 1727096041.49659: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.49779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.49820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.49858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.49937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.51606: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096041.51662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096041.51703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpd2b5dm0e /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py <<< 7530 1727096041.51707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py" <<< 7530 1727096041.51750: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpd2b5dm0e" to remote "/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py" <<< 7530 1727096041.52426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.52478: stderr chunk (state=3): >>><<< 7530 1727096041.52551: stdout chunk (state=3): >>><<< 7530 1727096041.52561: done transferring module to remote 7530 1727096041.52576: _low_level_execute_command(): starting 7530 1727096041.52585: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/ /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py && sleep 0' 7530 1727096041.53256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.53274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.53332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.53410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.53436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.53462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.53542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.55482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.55575: stdout chunk (state=3): >>><<< 7530 1727096041.55578: stderr chunk (state=3): >>><<< 7530 1727096041.55581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.55583: _low_level_execute_command(): starting 7530 1727096041.55585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/AnsiballZ_command.py && sleep 0' 7530 1727096041.56187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.56205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.56222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.56330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096041.56344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.56375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.56454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.73356: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-23 08:54:01.725282", "end": "2024-09-23 08:54:01.729424", "delta": "0:00:00.004142", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096041.75050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096041.75059: stdout chunk (state=3): >>><<< 7530 1727096041.75062: stderr chunk (state=3): >>><<< 7530 1727096041.75204: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-23 08:54:01.725282", "end": "2024-09-23 08:54:01.729424", "delta": "0:00:00.004142", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096041.75208: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096041.75210: _low_level_execute_command(): starting 7530 1727096041.75213: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096041.4543426-8718-233197633214192/ > /dev/null 2>&1 && sleep 0' 7530 1727096041.76239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.76297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.76520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.76673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.76696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.78705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.78709: stdout chunk (state=3): >>><<< 7530 1727096041.78711: stderr chunk (state=3): >>><<< 7530 1727096041.78735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.78873: handler run complete 7530 1727096041.78877: Evaluated conditional 
(False): False 7530 1727096041.78879: attempt loop complete, returning result 7530 1727096041.78881: variable 'item' from source: unknown 7530 1727096041.78893: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.004142", "end": "2024-09-23 08:54:01.729424", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-23 08:54:01.725282" } 7530 1727096041.79173: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.79176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.79179: variable 'omit' from source: magic vars 7530 1727096041.79348: variable 'ansible_distribution_major_version' from source: facts 7530 1727096041.79358: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096041.79616: variable 'type' from source: play vars 7530 1727096041.79630: variable 'state' from source: include params 7530 1727096041.79639: variable 'interface' from source: play vars 7530 1727096041.79678: variable 'current_interfaces' from source: set_fact 7530 1727096041.79747: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7530 1727096041.79751: variable 'omit' from source: magic vars 7530 1727096041.79753: variable 'omit' from source: magic vars 7530 1727096041.79789: variable 'item' from source: unknown 7530 1727096041.80138: variable 'item' from source: unknown 7530 1727096041.80246: variable 'omit' from source: magic vars 7530 1727096041.80250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096041.80252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 
1727096041.80254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096041.80260: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096041.80262: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.80264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.80314: Set connection var ansible_pipelining to False 7530 1727096041.80364: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096041.80378: Set connection var ansible_timeout to 10 7530 1727096041.80391: Set connection var ansible_shell_executable to /bin/sh 7530 1727096041.80573: Set connection var ansible_shell_type to sh 7530 1727096041.80576: Set connection var ansible_connection to ssh 7530 1727096041.80578: variable 'ansible_shell_executable' from source: unknown 7530 1727096041.80580: variable 'ansible_connection' from source: unknown 7530 1727096041.80582: variable 'ansible_module_compression' from source: unknown 7530 1727096041.80584: variable 'ansible_shell_type' from source: unknown 7530 1727096041.80586: variable 'ansible_shell_executable' from source: unknown 7530 1727096041.80587: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096041.80589: variable 'ansible_pipelining' from source: unknown 7530 1727096041.80591: variable 'ansible_timeout' from source: unknown 7530 1727096041.80593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096041.80719: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096041.80777: variable 'omit' from source: magic vars 7530 
1727096041.80790: starting attempt loop 7530 1727096041.80798: running the handler 7530 1727096041.80812: _low_level_execute_command(): starting 7530 1727096041.80820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096041.81602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.81621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096041.81673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.81739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.81766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.81792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.81903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.83630: stdout chunk (state=3): >>>/root <<< 7530 1727096041.83771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.83784: stdout chunk (state=3): >>><<< 7530 1727096041.83803: stderr chunk 
(state=3): >>><<< 7530 1727096041.83827: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.83842: _low_level_execute_command(): starting 7530 1727096041.83852: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629 `" && echo ansible-tmp-1727096041.8383262-8718-77645168146629="` echo /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629 `" ) && sleep 0' 7530 1727096041.84580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.84624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.84636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096041.84732: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096041.84735: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.84756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.84847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.84952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.86931: stdout chunk (state=3): >>>ansible-tmp-1727096041.8383262-8718-77645168146629=/root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629 <<< 7530 1727096041.87098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.87102: stdout chunk (state=3): >>><<< 7530 1727096041.87104: stderr chunk (state=3): >>><<< 7530 1727096041.87121: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096041.8383262-8718-77645168146629=/root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.87273: variable 'ansible_module_compression' from source: unknown 7530 1727096041.87276: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096041.87279: variable 'ansible_facts' from source: unknown 7530 1727096041.87287: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py 7530 1727096041.87413: Sending initial data 7530 1727096041.87512: Sent initial data (153 bytes) 7530 1727096041.88325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.88377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096041.88413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.88459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.90121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096041.90187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096041.90230: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp_bu4p240 /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py <<< 7530 1727096041.90234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py" <<< 7530 1727096041.90275: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp_bu4p240" to remote "/root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py" <<< 7530 1727096041.90980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.91082: stderr chunk (state=3): >>><<< 7530 1727096041.91086: stdout chunk (state=3): >>><<< 7530 1727096041.91096: done transferring module to remote 7530 1727096041.91110: _low_level_execute_command(): starting 7530 1727096041.91119: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/ /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py && sleep 0' 7530 1727096041.91902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.91906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096041.91932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.91949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.92026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096041.93950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096041.93983: stdout chunk (state=3): >>><<< 7530 1727096041.93988: stderr chunk (state=3): >>><<< 7530 1727096041.94007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096041.94093: _low_level_execute_command(): starting 7530 1727096041.94098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/AnsiballZ_command.py && sleep 0' 7530 1727096041.94643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096041.94829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096041.94837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096041.94864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.11333: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", 
"veth0", "up"], "start": "2024-09-23 08:54:02.106464", "end": "2024-09-23 08:54:02.110275", "delta": "0:00:00.003811", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096042.13057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096042.13085: stderr chunk (state=3): >>><<< 7530 1727096042.13088: stdout chunk (state=3): >>><<< 7530 1727096042.13112: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-23 08:54:02.106464", "end": "2024-09-23 08:54:02.110275", "delta": "0:00:00.003811", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096042.13140: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096042.13148: _low_level_execute_command(): starting 7530 1727096042.13155: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096041.8383262-8718-77645168146629/ > /dev/null 2>&1 && sleep 0' 7530 1727096042.13695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096042.13699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096042.13729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096042.13733: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096042.13735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.13737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.13797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.13800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.13802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.13846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.15716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096042.15744: stderr chunk (state=3): >>><<< 7530 1727096042.15747: stdout chunk (state=3): >>><<< 7530 1727096042.15761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096042.15765: handler run complete 7530 1727096042.15785: Evaluated conditional (False): False 7530 1727096042.15793: attempt loop complete, returning result 7530 1727096042.15808: variable 'item' from source: unknown 7530 1727096042.15872: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003811", "end": "2024-09-23 08:54:02.110275", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-23 08:54:02.106464" } 7530 1727096042.15987: dumping result to json 7530 1727096042.15991: done dumping result, returning 7530 1727096042.15995: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0afff68d-5257-086b-f4f0-0000000010ae] 7530 1727096042.15998: sending task result for task 0afff68d-5257-086b-f4f0-0000000010ae 7530 1727096042.16045: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010ae 7530 1727096042.16047: WORKER PROCESS EXITING 7530 1727096042.16113: no more pending results, returning what we have 7530 1727096042.16116: results queue empty 7530 1727096042.16117: checking for any_errors_fatal 7530 1727096042.16124: done checking for any_errors_fatal 7530 1727096042.16125: checking for max_fail_percentage 7530 1727096042.16127: done checking for 
max_fail_percentage 7530 1727096042.16128: checking to see if all hosts have failed and the running result is not ok 7530 1727096042.16129: done checking to see if all hosts have failed 7530 1727096042.16130: getting the remaining hosts for this loop 7530 1727096042.16131: done getting the remaining hosts for this loop 7530 1727096042.16134: getting the next task for host managed_node3 7530 1727096042.16140: done getting next task for host managed_node3 7530 1727096042.16142: ^ task is: TASK: Set up veth as managed by NetworkManager 7530 1727096042.16145: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096042.16148: getting variables 7530 1727096042.16150: in VariableManager get_vars() 7530 1727096042.16208: Calling all_inventory to load vars for managed_node3 7530 1727096042.16211: Calling groups_inventory to load vars for managed_node3 7530 1727096042.16214: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096042.16225: Calling all_plugins_play to load vars for managed_node3 7530 1727096042.16227: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096042.16230: Calling groups_plugins_play to load vars for managed_node3 7530 1727096042.17143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096042.18009: done with get_vars() 7530 1727096042.18036: done getting variables 7530 1727096042.18085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 08:54:02 -0400 (0:00:01.171) 0:00:32.969 ****** 7530 1727096042.18107: entering _queue_task() for managed_node3/command 7530 1727096042.18375: worker is 1 (out of 1 available) 7530 1727096042.18388: exiting _queue_task() for managed_node3/command 7530 1727096042.18400: done queuing things up, now waiting for results queue to drain 7530 1727096042.18402: waiting for pending results... 
7530 1727096042.18594: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7530 1727096042.18666: in run() - task 0afff68d-5257-086b-f4f0-0000000010af 7530 1727096042.18679: variable 'ansible_search_path' from source: unknown 7530 1727096042.18683: variable 'ansible_search_path' from source: unknown 7530 1727096042.18712: calling self._execute() 7530 1727096042.18793: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.18797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.18805: variable 'omit' from source: magic vars 7530 1727096042.19097: variable 'ansible_distribution_major_version' from source: facts 7530 1727096042.19108: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096042.19218: variable 'type' from source: play vars 7530 1727096042.19222: variable 'state' from source: include params 7530 1727096042.19226: Evaluated conditional (type == 'veth' and state == 'present'): True 7530 1727096042.19234: variable 'omit' from source: magic vars 7530 1727096042.19262: variable 'omit' from source: magic vars 7530 1727096042.19338: variable 'interface' from source: play vars 7530 1727096042.19352: variable 'omit' from source: magic vars 7530 1727096042.19389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096042.19419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096042.19439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096042.19452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096042.19463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096042.19488: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096042.19491: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.19494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.19569: Set connection var ansible_pipelining to False 7530 1727096042.19574: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096042.19579: Set connection var ansible_timeout to 10 7530 1727096042.19587: Set connection var ansible_shell_executable to /bin/sh 7530 1727096042.19590: Set connection var ansible_shell_type to sh 7530 1727096042.19592: Set connection var ansible_connection to ssh 7530 1727096042.19615: variable 'ansible_shell_executable' from source: unknown 7530 1727096042.19618: variable 'ansible_connection' from source: unknown 7530 1727096042.19621: variable 'ansible_module_compression' from source: unknown 7530 1727096042.19623: variable 'ansible_shell_type' from source: unknown 7530 1727096042.19626: variable 'ansible_shell_executable' from source: unknown 7530 1727096042.19628: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.19630: variable 'ansible_pipelining' from source: unknown 7530 1727096042.19632: variable 'ansible_timeout' from source: unknown 7530 1727096042.19634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.19739: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096042.19752: variable 'omit' from source: magic vars 7530 1727096042.19763: starting attempt loop 7530 1727096042.19766: running the handler 7530 1727096042.19781: _low_level_execute_command(): starting 7530 1727096042.19788: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096042.20319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.20323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.20326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096042.20333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.20373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.20378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.20398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.20430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.22073: stdout chunk (state=3): >>>/root <<< 7530 1727096042.22166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096042.22199: stderr chunk (state=3): >>><<< 7530 1727096042.22203: stdout chunk (state=3): >>><<< 7530 1727096042.22227: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096042.22246: _low_level_execute_command(): starting 7530 1727096042.22251: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974 `" && echo ansible-tmp-1727096042.2223086-8783-276205567968974="` echo /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974 `" ) && sleep 0' 7530 1727096042.22717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096042.22721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found <<< 7530 1727096042.22734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.22737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.22739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.22773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.22796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.22832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.24834: stdout chunk (state=3): >>>ansible-tmp-1727096042.2223086-8783-276205567968974=/root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974 <<< 7530 1727096042.24934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096042.24965: stderr chunk (state=3): >>><<< 7530 1727096042.24970: stdout chunk (state=3): >>><<< 7530 1727096042.24992: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096042.2223086-8783-276205567968974=/root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096042.25023: variable 'ansible_module_compression' from source: unknown 7530 1727096042.25065: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096042.25096: variable 'ansible_facts' from source: unknown 7530 1727096042.25154: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py 7530 1727096042.25263: Sending initial data 7530 1727096042.25266: Sent initial data (154 bytes) 7530 1727096042.25734: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.25737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.25740: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.25742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.25797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.25805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.25807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.25844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.27486: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096042.27509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096042.27547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmphsslybbw /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py <<< 7530 1727096042.27549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py" <<< 7530 1727096042.27578: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmphsslybbw" to remote "/root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py" <<< 7530 1727096042.27585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py" <<< 7530 1727096042.28057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096042.28106: stderr chunk (state=3): >>><<< 7530 1727096042.28110: stdout chunk (state=3): >>><<< 7530 1727096042.28150: done transferring module to remote 7530 1727096042.28159: _low_level_execute_command(): starting 7530 1727096042.28165: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/ /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py && sleep 0' 7530 1727096042.28631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096042.28635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096042.28637: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096042.28639: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096042.28645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.28700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.28704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.28706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.28731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.30561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096042.30589: stderr chunk (state=3): >>><<< 7530 1727096042.30592: stdout chunk (state=3): >>><<< 7530 1727096042.30607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096042.30610: _low_level_execute_command(): starting 7530 1727096042.30617: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/AnsiballZ_command.py && sleep 0' 7530 1727096042.31085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.31089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.31091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.31094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.31146: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.31149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.31151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.31202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.49213: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-23 08:54:02.471400", "end": "2024-09-23 08:54:02.489457", "delta": "0:00:00.018057", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096042.50976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096042.51002: stderr chunk (state=3): >>><<< 7530 1727096042.51006: stdout chunk (state=3): >>><<< 7530 1727096042.51022: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-23 08:54:02.471400", "end": "2024-09-23 08:54:02.489457", "delta": "0:00:00.018057", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096042.51061: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096042.51070: _low_level_execute_command(): starting 7530 1727096042.51075: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096042.2223086-8783-276205567968974/ > /dev/null 2>&1 && sleep 0' 7530 1727096042.51538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.51542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.51551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096042.51553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096042.51556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.51603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.51606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.51608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.51649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096042.53524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096042.53558: stderr chunk (state=3): >>><<< 7530 1727096042.53561: stdout chunk (state=3): >>><<< 7530 1727096042.53576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096042.53582: handler run complete 7530 1727096042.53600: Evaluated conditional (False): False 7530 1727096042.53609: attempt loop complete, returning result 7530 1727096042.53611: _execute() done 7530 1727096042.53613: dumping result to json 7530 1727096042.53618: done dumping result, returning 7530 1727096042.53625: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-086b-f4f0-0000000010af] 7530 1727096042.53633: sending task result for task 0afff68d-5257-086b-f4f0-0000000010af 7530 1727096042.53731: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010af 7530 1727096042.53734: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.018057", "end": "2024-09-23 08:54:02.489457", "rc": 0, "start": "2024-09-23 08:54:02.471400" } 7530 1727096042.53828: no more pending results, returning what we have 7530 1727096042.53834: results queue empty 7530 1727096042.53835: checking for any_errors_fatal 7530 1727096042.53848: done checking for any_errors_fatal 7530 1727096042.53849: checking for max_fail_percentage 7530 1727096042.53851: done checking for max_fail_percentage 7530 1727096042.53851: checking to see if all hosts have failed and the running result is not ok 7530 1727096042.53853: done checking to see if all hosts have failed 7530 1727096042.53854: getting the remaining hosts for this loop 7530 1727096042.53855: done getting the remaining hosts for this loop 7530 1727096042.53858: getting the next task for host managed_node3 7530 1727096042.53864: done getting next task for host managed_node3 7530 1727096042.53869: ^ task is: TASK: Delete veth interface {{ interface }} 7530 1727096042.53871: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096042.53875: getting variables 7530 1727096042.53876: in VariableManager get_vars() 7530 1727096042.53919: Calling all_inventory to load vars for managed_node3 7530 1727096042.53922: Calling groups_inventory to load vars for managed_node3 7530 1727096042.53924: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096042.53936: Calling all_plugins_play to load vars for managed_node3 7530 1727096042.53939: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096042.53942: Calling groups_plugins_play to load vars for managed_node3 7530 1727096042.54727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096042.55714: done with get_vars() 7530 1727096042.55737: done getting variables 7530 1727096042.55784: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096042.55878: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 08:54:02 -0400 (0:00:00.377) 0:00:33.347 
****** 7530 1727096042.55902: entering _queue_task() for managed_node3/command 7530 1727096042.56172: worker is 1 (out of 1 available) 7530 1727096042.56184: exiting _queue_task() for managed_node3/command 7530 1727096042.56198: done queuing things up, now waiting for results queue to drain 7530 1727096042.56199: waiting for pending results... 7530 1727096042.56380: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7530 1727096042.56445: in run() - task 0afff68d-5257-086b-f4f0-0000000010b0 7530 1727096042.56458: variable 'ansible_search_path' from source: unknown 7530 1727096042.56462: variable 'ansible_search_path' from source: unknown 7530 1727096042.56493: calling self._execute() 7530 1727096042.56577: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.56581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.56590: variable 'omit' from source: magic vars 7530 1727096042.56869: variable 'ansible_distribution_major_version' from source: facts 7530 1727096042.56887: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096042.57026: variable 'type' from source: play vars 7530 1727096042.57029: variable 'state' from source: include params 7530 1727096042.57037: variable 'interface' from source: play vars 7530 1727096042.57041: variable 'current_interfaces' from source: set_fact 7530 1727096042.57049: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7530 1727096042.57052: when evaluation is False, skipping this task 7530 1727096042.57055: _execute() done 7530 1727096042.57057: dumping result to json 7530 1727096042.57059: done dumping result, returning 7530 1727096042.57066: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0afff68d-5257-086b-f4f0-0000000010b0] 7530 1727096042.57076: sending task result for task 0afff68d-5257-086b-f4f0-0000000010b0 7530 
1727096042.57164: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010b0 7530 1727096042.57169: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096042.57236: no more pending results, returning what we have 7530 1727096042.57240: results queue empty 7530 1727096042.57241: checking for any_errors_fatal 7530 1727096042.57249: done checking for any_errors_fatal 7530 1727096042.57250: checking for max_fail_percentage 7530 1727096042.57252: done checking for max_fail_percentage 7530 1727096042.57252: checking to see if all hosts have failed and the running result is not ok 7530 1727096042.57254: done checking to see if all hosts have failed 7530 1727096042.57254: getting the remaining hosts for this loop 7530 1727096042.57256: done getting the remaining hosts for this loop 7530 1727096042.57260: getting the next task for host managed_node3 7530 1727096042.57266: done getting next task for host managed_node3 7530 1727096042.57272: ^ task is: TASK: Create dummy interface {{ interface }} 7530 1727096042.57274: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096042.57278: getting variables 7530 1727096042.57280: in VariableManager get_vars() 7530 1727096042.57328: Calling all_inventory to load vars for managed_node3 7530 1727096042.57333: Calling groups_inventory to load vars for managed_node3 7530 1727096042.57335: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096042.57348: Calling all_plugins_play to load vars for managed_node3 7530 1727096042.57350: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096042.57353: Calling groups_plugins_play to load vars for managed_node3 7530 1727096042.58149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096042.59023: done with get_vars() 7530 1727096042.59052: done getting variables 7530 1727096042.59100: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096042.59189: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 08:54:02 -0400 (0:00:00.033) 0:00:33.380 ****** 7530 1727096042.59214: entering _queue_task() for managed_node3/command 7530 1727096042.59477: worker is 1 (out of 1 available) 7530 1727096042.59492: exiting _queue_task() for managed_node3/command 7530 1727096042.59506: done queuing things up, now waiting for results queue to drain 7530 1727096042.59507: waiting for pending results... 
7530 1727096042.59695: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7530 1727096042.59770: in run() - task 0afff68d-5257-086b-f4f0-0000000010b1 7530 1727096042.59782: variable 'ansible_search_path' from source: unknown 7530 1727096042.59786: variable 'ansible_search_path' from source: unknown 7530 1727096042.59819: calling self._execute() 7530 1727096042.59896: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.59901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.59911: variable 'omit' from source: magic vars 7530 1727096042.60194: variable 'ansible_distribution_major_version' from source: facts 7530 1727096042.60204: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096042.60350: variable 'type' from source: play vars 7530 1727096042.60353: variable 'state' from source: include params 7530 1727096042.60357: variable 'interface' from source: play vars 7530 1727096042.60359: variable 'current_interfaces' from source: set_fact 7530 1727096042.60371: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7530 1727096042.60374: when evaluation is False, skipping this task 7530 1727096042.60378: _execute() done 7530 1727096042.60381: dumping result to json 7530 1727096042.60384: done dumping result, returning 7530 1727096042.60386: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0afff68d-5257-086b-f4f0-0000000010b1] 7530 1727096042.60392: sending task result for task 0afff68d-5257-086b-f4f0-0000000010b1 7530 1727096042.60480: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010b1 7530 1727096042.60482: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096042.60547: no more pending results, returning what we have 7530 1727096042.60551: results queue empty 7530 1727096042.60552: checking for any_errors_fatal 7530 1727096042.60557: done checking for any_errors_fatal 7530 1727096042.60558: checking for max_fail_percentage 7530 1727096042.60559: done checking for max_fail_percentage 7530 1727096042.60560: checking to see if all hosts have failed and the running result is not ok 7530 1727096042.60561: done checking to see if all hosts have failed 7530 1727096042.60562: getting the remaining hosts for this loop 7530 1727096042.60563: done getting the remaining hosts for this loop 7530 1727096042.60569: getting the next task for host managed_node3 7530 1727096042.60576: done getting next task for host managed_node3 7530 1727096042.60579: ^ task is: TASK: Delete dummy interface {{ interface }} 7530 1727096042.60582: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096042.60586: getting variables 7530 1727096042.60588: in VariableManager get_vars() 7530 1727096042.60647: Calling all_inventory to load vars for managed_node3 7530 1727096042.60650: Calling groups_inventory to load vars for managed_node3 7530 1727096042.60652: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096042.60664: Calling all_plugins_play to load vars for managed_node3 7530 1727096042.60666: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096042.60671: Calling groups_plugins_play to load vars for managed_node3 7530 1727096042.66486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096042.67364: done with get_vars() 7530 1727096042.67393: done getting variables 7530 1727096042.67434: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096042.67507: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 08:54:02 -0400 (0:00:00.083) 0:00:33.463 ****** 7530 1727096042.67527: entering _queue_task() for managed_node3/command 7530 1727096042.67803: worker is 1 (out of 1 available) 7530 1727096042.67816: exiting _queue_task() for managed_node3/command 7530 1727096042.67831: done queuing things up, now waiting for results queue to drain 7530 1727096042.67833: waiting for pending results... 
7530 1727096042.68017: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7530 1727096042.68099: in run() - task 0afff68d-5257-086b-f4f0-0000000010b2 7530 1727096042.68111: variable 'ansible_search_path' from source: unknown 7530 1727096042.68118: variable 'ansible_search_path' from source: unknown 7530 1727096042.68147: calling self._execute() 7530 1727096042.68233: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.68237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.68244: variable 'omit' from source: magic vars 7530 1727096042.68534: variable 'ansible_distribution_major_version' from source: facts 7530 1727096042.68543: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096042.68688: variable 'type' from source: play vars 7530 1727096042.68691: variable 'state' from source: include params 7530 1727096042.68696: variable 'interface' from source: play vars 7530 1727096042.68701: variable 'current_interfaces' from source: set_fact 7530 1727096042.68713: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7530 1727096042.68716: when evaluation is False, skipping this task 7530 1727096042.68718: _execute() done 7530 1727096042.68721: dumping result to json 7530 1727096042.68723: done dumping result, returning 7530 1727096042.68725: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0afff68d-5257-086b-f4f0-0000000010b2] 7530 1727096042.68734: sending task result for task 0afff68d-5257-086b-f4f0-0000000010b2 7530 1727096042.68820: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010b2 7530 1727096042.68823: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096042.68877: no more pending results, returning what we have 7530 1727096042.68881: results queue empty 7530 1727096042.68882: checking for any_errors_fatal 7530 1727096042.68889: done checking for any_errors_fatal 7530 1727096042.68889: checking for max_fail_percentage 7530 1727096042.68891: done checking for max_fail_percentage 7530 1727096042.68892: checking to see if all hosts have failed and the running result is not ok 7530 1727096042.68893: done checking to see if all hosts have failed 7530 1727096042.68894: getting the remaining hosts for this loop 7530 1727096042.68895: done getting the remaining hosts for this loop 7530 1727096042.68899: getting the next task for host managed_node3 7530 1727096042.68905: done getting next task for host managed_node3 7530 1727096042.68908: ^ task is: TASK: Create tap interface {{ interface }} 7530 1727096042.68911: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096042.68916: getting variables 7530 1727096042.68918: in VariableManager get_vars() 7530 1727096042.68973: Calling all_inventory to load vars for managed_node3 7530 1727096042.68976: Calling groups_inventory to load vars for managed_node3 7530 1727096042.68978: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096042.68990: Calling all_plugins_play to load vars for managed_node3 7530 1727096042.68993: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096042.68995: Calling groups_plugins_play to load vars for managed_node3 7530 1727096042.69790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096042.70694: done with get_vars() 7530 1727096042.70721: done getting variables 7530 1727096042.70772: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096042.70863: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 08:54:02 -0400 (0:00:00.033) 0:00:33.497 ****** 7530 1727096042.70888: entering _queue_task() for managed_node3/command 7530 1727096042.71161: worker is 1 (out of 1 available) 7530 1727096042.71175: exiting _queue_task() for managed_node3/command 7530 1727096042.71188: done queuing things up, now waiting for results queue to drain 7530 1727096042.71190: waiting for pending results... 
7530 1727096042.71382: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7530 1727096042.71463: in run() - task 0afff68d-5257-086b-f4f0-0000000010b3 7530 1727096042.71478: variable 'ansible_search_path' from source: unknown 7530 1727096042.71482: variable 'ansible_search_path' from source: unknown 7530 1727096042.71513: calling self._execute() 7530 1727096042.71598: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096042.71602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096042.71613: variable 'omit' from source: magic vars 7530 1727096042.71920: variable 'ansible_distribution_major_version' from source: facts 7530 1727096042.71930: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096042.72077: variable 'type' from source: play vars 7530 1727096042.72081: variable 'state' from source: include params 7530 1727096042.72086: variable 'interface' from source: play vars 7530 1727096042.72090: variable 'current_interfaces' from source: set_fact 7530 1727096042.72098: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7530 1727096042.72102: when evaluation is False, skipping this task 7530 1727096042.72104: _execute() done 7530 1727096042.72107: dumping result to json 7530 1727096042.72110: done dumping result, returning 7530 1727096042.72116: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0afff68d-5257-086b-f4f0-0000000010b3] 7530 1727096042.72121: sending task result for task 0afff68d-5257-086b-f4f0-0000000010b3 7530 1727096042.72210: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010b3 7530 1727096042.72213: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096042.72265: no more pending results, returning what we have 7530 1727096042.72270: results queue empty 7530 1727096042.72271: checking for any_errors_fatal 7530 1727096042.72277: done checking for any_errors_fatal 7530 1727096042.72278: checking for max_fail_percentage 7530 1727096042.72280: done checking for max_fail_percentage 7530 1727096042.72280: checking to see if all hosts have failed and the running result is not ok 7530 1727096042.72282: done checking to see if all hosts have failed 7530 1727096042.72282: getting the remaining hosts for this loop 7530 1727096042.72284: done getting the remaining hosts for this loop 7530 1727096042.72287: getting the next task for host managed_node3 7530 1727096042.72293: done getting next task for host managed_node3 7530 1727096042.72296: ^ task is: TASK: Delete tap interface {{ interface }} 7530 1727096042.72299: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096042.72303: getting variables 7530 1727096042.72305: in VariableManager get_vars() 7530 1727096042.72363: Calling all_inventory to load vars for managed_node3 7530 1727096042.72366: Calling groups_inventory to load vars for managed_node3 7530 1727096042.72370: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096042.72382: Calling all_plugins_play to load vars for managed_node3 7530 1727096042.72385: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096042.72387: Calling groups_plugins_play to load vars for managed_node3 7530 1727096042.73327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096042.74209: done with get_vars() 7530 1727096042.74236: done getting variables 7530 1727096042.74284: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096042.74373: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 08:54:02 -0400 (0:00:00.035) 0:00:33.532 ****** 7530 1727096042.74398: entering _queue_task() for managed_node3/command 7530 1727096042.74661: worker is 1 (out of 1 available) 7530 1727096042.74677: exiting _queue_task() for managed_node3/command 7530 1727096042.74689: done queuing things up, now waiting for results queue to drain 7530 1727096042.74691: waiting for pending results... 
7530 1727096042.74882: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0
7530 1727096042.74963: in run() - task 0afff68d-5257-086b-f4f0-0000000010b4
7530 1727096042.74976: variable 'ansible_search_path' from source: unknown
7530 1727096042.74980: variable 'ansible_search_path' from source: unknown
7530 1727096042.75010: calling self._execute()
7530 1727096042.75097: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.75102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.75111: variable 'omit' from source: magic vars
7530 1727096042.75402: variable 'ansible_distribution_major_version' from source: facts
7530 1727096042.75413: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096042.75555: variable 'type' from source: play vars
7530 1727096042.75559: variable 'state' from source: include params
7530 1727096042.75562: variable 'interface' from source: play vars
7530 1727096042.75567: variable 'current_interfaces' from source: set_fact
7530 1727096042.75581: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
7530 1727096042.75585: when evaluation is False, skipping this task
7530 1727096042.75587: _execute() done
7530 1727096042.75589: dumping result to json
7530 1727096042.75592: done dumping result, returning
7530 1727096042.75595: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0afff68d-5257-086b-f4f0-0000000010b4]
7530 1727096042.75600: sending task result for task 0afff68d-5257-086b-f4f0-0000000010b4
7530 1727096042.75687: done sending task result for task 0afff68d-5257-086b-f4f0-0000000010b4
7530 1727096042.75690: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
7530 1727096042.75741: no more pending results, returning what we have
7530 1727096042.75745: results queue empty
7530 1727096042.75746: checking for any_errors_fatal
7530 1727096042.75750: done checking for any_errors_fatal
7530 1727096042.75751: checking for max_fail_percentage
7530 1727096042.75752: done checking for max_fail_percentage
7530 1727096042.75753: checking to see if all hosts have failed and the running result is not ok
7530 1727096042.75754: done checking to see if all hosts have failed
7530 1727096042.75755: getting the remaining hosts for this loop
7530 1727096042.75756: done getting the remaining hosts for this loop
7530 1727096042.75759: getting the next task for host managed_node3
7530 1727096042.75771: done getting next task for host managed_node3
7530 1727096042.75777: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7530 1727096042.75781: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096042.75807: getting variables
7530 1727096042.75809: in VariableManager get_vars()
7530 1727096042.75855: Calling all_inventory to load vars for managed_node3
7530 1727096042.75858: Calling groups_inventory to load vars for managed_node3
7530 1727096042.75860: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096042.75879: Calling all_plugins_play to load vars for managed_node3
7530 1727096042.75882: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096042.75885: Calling groups_plugins_play to load vars for managed_node3
7530 1727096042.76670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096042.77559: done with get_vars()
7530 1727096042.77585: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Monday 23 September 2024 08:54:02 -0400 (0:00:00.032) 0:00:33.564 ******
7530 1727096042.77663: entering _queue_task() for managed_node3/include_tasks
7530 1727096042.77929: worker is 1 (out of 1 available)
7530 1727096042.77943: exiting _queue_task() for managed_node3/include_tasks
7530 1727096042.77955: done queuing things up, now waiting for results queue to drain
7530 1727096042.77956: waiting for pending results...
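The "Delete tap interface veth0" skip above hinges on a three-part `when` conditional that Ansible evaluates through Jinja2. A minimal Python sketch of the same boolean logic, with hypothetical variable values (the log does not show which clause was false):

```python
# Sketch of: when: type == 'tap' and state == 'absent' and interface in current_interfaces
# Ansible templates this via Jinja2; plain Python `and` has the same
# short-circuit semantics. Values below are hypothetical examples.
def should_run(type_, state, interface, current_interfaces):
    return type_ == 'tap' and state == 'absent' and interface in current_interfaces

# If the interface under test is a veth rather than a tap device, the
# first clause fails and the task is skipped, as in the log.
print(should_run('veth', 'absent', 'veth0', ['veth0', 'peerveth0']))  # prints: False
```

The `false_condition` field in the skip result echoes the whole expression, not the individual clause that failed.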
7530 1727096042.78144: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7530 1727096042.78251: in run() - task 0afff68d-5257-086b-f4f0-0000000000b8
7530 1727096042.78263: variable 'ansible_search_path' from source: unknown
7530 1727096042.78266: variable 'ansible_search_path' from source: unknown
7530 1727096042.78300: calling self._execute()
7530 1727096042.78384: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.78388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.78402: variable 'omit' from source: magic vars
7530 1727096042.78688: variable 'ansible_distribution_major_version' from source: facts
7530 1727096042.78699: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096042.78705: _execute() done
7530 1727096042.78708: dumping result to json
7530 1727096042.78710: done dumping result, returning
7530 1727096042.78717: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-086b-f4f0-0000000000b8]
7530 1727096042.78728: sending task result for task 0afff68d-5257-086b-f4f0-0000000000b8
7530 1727096042.78818: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000b8
7530 1727096042.78820: WORKER PROCESS EXITING
7530 1727096042.78879: no more pending results, returning what we have
7530 1727096042.78884: in VariableManager get_vars()
7530 1727096042.78942: Calling all_inventory to load vars for managed_node3
7530 1727096042.78945: Calling groups_inventory to load vars for managed_node3
7530 1727096042.78947: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096042.78958: Calling all_plugins_play to load vars for managed_node3
7530 1727096042.78961: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096042.78963: Calling groups_plugins_play to load vars for managed_node3
7530 1727096042.79866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096042.80750: done with get_vars()
7530 1727096042.80773: variable 'ansible_search_path' from source: unknown
7530 1727096042.80774: variable 'ansible_search_path' from source: unknown
7530 1727096042.80808: we have included files to process
7530 1727096042.80809: generating all_blocks data
7530 1727096042.80810: done generating all_blocks data
7530 1727096042.80815: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7530 1727096042.80816: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7530 1727096042.80817: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
7530 1727096042.81211: done processing included file
7530 1727096042.81213: iterating over new_blocks loaded from include file
7530 1727096042.81214: in VariableManager get_vars()
7530 1727096042.81236: done with get_vars()
7530 1727096042.81237: filtering new block on tags
7530 1727096042.81252: done filtering new block on tags
7530 1727096042.81254: in VariableManager get_vars()
7530 1727096042.81274: done with get_vars()
7530 1727096042.81275: filtering new block on tags
7530 1727096042.81288: done filtering new block on tags
7530 1727096042.81290: in VariableManager get_vars()
7530 1727096042.81306: done with get_vars()
7530 1727096042.81307: filtering new block on tags
7530 1727096042.81318: done filtering new block on tags
7530 1727096042.81319: done iterating over new_blocks loaded from include file
included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3
7530 1727096042.81324: extending task lists for all hosts with included blocks
7530 1727096042.81783: done extending task lists
7530 1727096042.81785: done processing included files
7530 1727096042.81786: results queue empty
7530 1727096042.81786: checking for any_errors_fatal
7530 1727096042.81789: done checking for any_errors_fatal
7530 1727096042.81790: checking for max_fail_percentage
7530 1727096042.81791: done checking for max_fail_percentage
7530 1727096042.81791: checking to see if all hosts have failed and the running result is not ok
7530 1727096042.81792: done checking to see if all hosts have failed
7530 1727096042.81792: getting the remaining hosts for this loop
7530 1727096042.81793: done getting the remaining hosts for this loop
7530 1727096042.81795: getting the next task for host managed_node3
7530 1727096042.81798: done getting next task for host managed_node3
7530 1727096042.81800: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
7530 1727096042.81802: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096042.81811: getting variables
7530 1727096042.81812: in VariableManager get_vars()
7530 1727096042.81829: Calling all_inventory to load vars for managed_node3
7530 1727096042.81831: Calling groups_inventory to load vars for managed_node3
7530 1727096042.81833: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096042.81837: Calling all_plugins_play to load vars for managed_node3
7530 1727096042.81839: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096042.81840: Calling groups_plugins_play to load vars for managed_node3
7530 1727096042.82595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096042.83456: done with get_vars()
7530 1727096042.83483: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Monday 23 September 2024 08:54:02 -0400 (0:00:00.058) 0:00:33.623 ******
7530 1727096042.83544: entering _queue_task() for managed_node3/setup
7530 1727096042.83811: worker is 1 (out of 1 available)
7530 1727096042.83824: exiting _queue_task() for managed_node3/setup
7530 1727096042.83836: done queuing things up, now waiting for results queue to drain
7530 1727096042.83838: waiting for pending results...
7530 1727096042.84025: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
7530 1727096042.84150: in run() - task 0afff68d-5257-086b-f4f0-000000001381
7530 1727096042.84162: variable 'ansible_search_path' from source: unknown
7530 1727096042.84166: variable 'ansible_search_path' from source: unknown
7530 1727096042.84200: calling self._execute()
7530 1727096042.84283: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.84289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.84300: variable 'omit' from source: magic vars
7530 1727096042.84582: variable 'ansible_distribution_major_version' from source: facts
7530 1727096042.84593: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096042.84751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096042.86313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096042.86374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096042.86406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096042.86434: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096042.86453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096042.86517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096042.86538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096042.86556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096042.86588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096042.86598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096042.86639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096042.86656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096042.86680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096042.86706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096042.86716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096042.86828: variable '__network_required_facts' from source: role '' defaults
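The role's fact check evaluated next in the log, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, is a set difference: which required fact names are not yet gathered. A small Python sketch of that evaluation; the fact names and values are hypothetical stand-ins:

```python
# Sketch of the Jinja2 expression
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Ansible's `difference` filter acts like a set difference on lists.
# Fact names/values below are hypothetical examples, not from the log.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
    "hostname": "managed_node3",
}

missing = [f for f in required_facts if f not in ansible_facts]
# length > 0 would mean facts are missing and the setup task must run;
# here nothing is missing, so the task is skipped, as in the log.
print(len(missing) > 0)  # prints: False
```

When the expression is False the gated `setup` task is skipped, which is why the log shows `skipping: [managed_node3]` with the output censored by `no_log: true`.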
7530 1727096042.86838: variable 'ansible_facts' from source: unknown
7530 1727096042.87305: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
7530 1727096042.87309: when evaluation is False, skipping this task
7530 1727096042.87312: _execute() done
7530 1727096042.87314: dumping result to json
7530 1727096042.87317: done dumping result, returning
7530 1727096042.87320: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-086b-f4f0-000000001381]
7530 1727096042.87326: sending task result for task 0afff68d-5257-086b-f4f0-000000001381
7530 1727096042.87415: done sending task result for task 0afff68d-5257-086b-f4f0-000000001381
7530 1727096042.87418: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7530 1727096042.87463: no more pending results, returning what we have
7530 1727096042.87478: results queue empty
7530 1727096042.87479: checking for any_errors_fatal
7530 1727096042.87480: done checking for any_errors_fatal
7530 1727096042.87481: checking for max_fail_percentage
7530 1727096042.87482: done checking for max_fail_percentage
7530 1727096042.87483: checking to see if all hosts have failed and the running result is not ok
7530 1727096042.87484: done checking to see if all hosts have failed
7530 1727096042.87485: getting the remaining hosts for this loop
7530 1727096042.87486: done getting the remaining hosts for this loop
7530 1727096042.87490: getting the next task for host managed_node3
7530 1727096042.87498: done getting next task for host managed_node3
7530 1727096042.87502: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
7530 1727096042.87506: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096042.87526: getting variables
7530 1727096042.87527: in VariableManager get_vars()
7530 1727096042.87586: Calling all_inventory to load vars for managed_node3
7530 1727096042.87590: Calling groups_inventory to load vars for managed_node3
7530 1727096042.87592: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096042.87603: Calling all_plugins_play to load vars for managed_node3
7530 1727096042.87605: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096042.87607: Calling groups_plugins_play to load vars for managed_node3
7530 1727096042.88424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096042.89321: done with get_vars()
7530 1727096042.89349: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Monday 23 September 2024 08:54:02 -0400 (0:00:00.058) 0:00:33.682 ******
7530 1727096042.89438: entering _queue_task() for managed_node3/stat
7530 1727096042.89702: worker is 1 (out of 1 available)
7530 1727096042.89716: exiting _queue_task() for managed_node3/stat
7530 1727096042.89732: done queuing things up, now waiting for results queue to drain
7530 1727096042.89734: waiting for pending results...
7530 1727096042.89920: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
7530 1727096042.90038: in run() - task 0afff68d-5257-086b-f4f0-000000001383
7530 1727096042.90051: variable 'ansible_search_path' from source: unknown
7530 1727096042.90054: variable 'ansible_search_path' from source: unknown
7530 1727096042.90089: calling self._execute()
7530 1727096042.90165: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.90172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.90180: variable 'omit' from source: magic vars
7530 1727096042.90464: variable 'ansible_distribution_major_version' from source: facts
7530 1727096042.90477: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096042.90597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096042.90801: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096042.90843: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096042.90866: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096042.90894: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096042.90997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096042.91018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096042.91037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096042.91056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096042.91141: variable '__network_is_ostree' from source: set_fact
7530 1727096042.91147: Evaluated conditional (not __network_is_ostree is defined): False
7530 1727096042.91150: when evaluation is False, skipping this task
7530 1727096042.91157: _execute() done
7530 1727096042.91160: dumping result to json
7530 1727096042.91162: done dumping result, returning
7530 1727096042.91174: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-086b-f4f0-000000001383]
7530 1727096042.91177: sending task result for task 0afff68d-5257-086b-f4f0-000000001383
7530 1727096042.91265: done sending task result for task 0afff68d-5257-086b-f4f0-000000001383
7530 1727096042.91271: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096042.91321: no more pending results, returning what we have
7530 1727096042.91325: results queue empty
7530 1727096042.91325: checking for any_errors_fatal
7530 1727096042.91332: done checking for any_errors_fatal
7530 1727096042.91333: checking for max_fail_percentage
7530 1727096042.91335: done checking for max_fail_percentage
7530 1727096042.91335: checking to see if all hosts have failed and the running result is not ok
7530 1727096042.91337: done checking to see if all hosts have failed
7530 1727096042.91337: getting the remaining hosts for this loop
7530 1727096042.91339: done getting the remaining hosts for this loop
7530 1727096042.91343: getting the next task for host managed_node3
7530 1727096042.91349: done getting next task for host managed_node3
7530 1727096042.91352: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
7530 1727096042.91356: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096042.91378: getting variables
7530 1727096042.91380: in VariableManager get_vars()
7530 1727096042.91429: Calling all_inventory to load vars for managed_node3
7530 1727096042.91432: Calling groups_inventory to load vars for managed_node3
7530 1727096042.91435: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096042.91444: Calling all_plugins_play to load vars for managed_node3
7530 1727096042.91447: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096042.91449: Calling groups_plugins_play to load vars for managed_node3
7530 1727096042.92409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096042.93300: done with get_vars()
7530 1727096042.93328: done getting variables
7530 1727096042.93377: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Monday 23 September 2024 08:54:02 -0400 (0:00:00.039) 0:00:33.722 ******
7530 1727096042.93404: entering _queue_task() for managed_node3/set_fact
7530 1727096042.93671: worker is 1 (out of 1 available)
7530 1727096042.93686: exiting _queue_task() for managed_node3/set_fact
7530 1727096042.93699: done queuing things up, now waiting for results queue to drain
7530 1727096042.93701: waiting for pending results...
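The ostree `stat` check above is guarded by `when: not __network_is_ostree is defined`, so once the fact exists (here from an earlier `set_fact`) the probe is never repeated. A minimal Python sketch of that memoization pattern; the dict stands in for the host's fact cache:

```python
# Mirror of `when: not __network_is_ostree is defined`: the stat/set_fact
# pair runs only while the fact is absent, then never again.
# The fact-cache dict below is a hypothetical stand-in.
def should_check(facts):
    # Jinja2's `is defined` test corresponds to key membership here.
    return "__network_is_ostree" not in facts

facts = {"__network_is_ostree": False}  # already set by a prior run
print(should_check(facts))  # prints: False -> both guarded tasks skip
```

This is why the log shows the same `false_condition` for both the "Check if system is ostree" and "Set flag to indicate system is ostree" tasks.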
7530 1727096042.93896: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
7530 1727096042.94017: in run() - task 0afff68d-5257-086b-f4f0-000000001384
7530 1727096042.94031: variable 'ansible_search_path' from source: unknown
7530 1727096042.94035: variable 'ansible_search_path' from source: unknown
7530 1727096042.94069: calling self._execute()
7530 1727096042.94154: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.94158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.94165: variable 'omit' from source: magic vars
7530 1727096042.94449: variable 'ansible_distribution_major_version' from source: facts
7530 1727096042.94458: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096042.94586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096042.94785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096042.94820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096042.94848: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096042.94876: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096042.94970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096042.94989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096042.95008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096042.95030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096042.95106: variable '__network_is_ostree' from source: set_fact
7530 1727096042.95113: Evaluated conditional (not __network_is_ostree is defined): False
7530 1727096042.95116: when evaluation is False, skipping this task
7530 1727096042.95120: _execute() done
7530 1727096042.95123: dumping result to json
7530 1727096042.95126: done dumping result, returning
7530 1727096042.95138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-086b-f4f0-000000001384]
7530 1727096042.95141: sending task result for task 0afff68d-5257-086b-f4f0-000000001384
7530 1727096042.95227: done sending task result for task 0afff68d-5257-086b-f4f0-000000001384
7530 1727096042.95230: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096042.95285: no more pending results, returning what we have
7530 1727096042.95288: results queue empty
7530 1727096042.95289: checking for any_errors_fatal
7530 1727096042.95296: done checking for any_errors_fatal
7530 1727096042.95297: checking for max_fail_percentage
7530 1727096042.95299: done checking for max_fail_percentage
7530 1727096042.95299: checking to see if all hosts have failed and the running result is not ok
7530 1727096042.95301: done checking to see if all hosts have failed
7530 1727096042.95301: getting the remaining hosts for this loop
7530 1727096042.95303: done getting the remaining hosts for this loop
7530 1727096042.95306: getting the next task for host managed_node3
7530 1727096042.95314: done getting next task for host managed_node3
7530 1727096042.95318: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
7530 1727096042.95322: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096042.95343: getting variables
7530 1727096042.95346: in VariableManager get_vars()
7530 1727096042.95394: Calling all_inventory to load vars for managed_node3
7530 1727096042.95396: Calling groups_inventory to load vars for managed_node3
7530 1727096042.95399: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096042.95409: Calling all_plugins_play to load vars for managed_node3
7530 1727096042.95412: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096042.95414: Calling groups_plugins_play to load vars for managed_node3
7530 1727096042.96248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096042.97143: done with get_vars()
7530 1727096042.97171: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Monday 23 September 2024 08:54:02 -0400 (0:00:00.038) 0:00:33.760 ******
7530 1727096042.97248: entering _queue_task() for managed_node3/service_facts
7530 1727096042.97548: worker is 1 (out of 1 available)
7530 1727096042.97560: exiting _queue_task() for managed_node3/service_facts
7530 1727096042.97575: done queuing things up, now waiting for results queue to drain
7530 1727096042.97577: waiting for pending results...
7530 1727096042.97993: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running
7530 1727096042.98065: in run() - task 0afff68d-5257-086b-f4f0-000000001386
7530 1727096042.98095: variable 'ansible_search_path' from source: unknown
7530 1727096042.98102: variable 'ansible_search_path' from source: unknown
7530 1727096042.98144: calling self._execute()
7530 1727096042.98249: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.98261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.98278: variable 'omit' from source: magic vars
7530 1727096042.98693: variable 'ansible_distribution_major_version' from source: facts
7530 1727096042.98697: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096042.98703: variable 'omit' from source: magic vars
7530 1727096042.98760: variable 'omit' from source: magic vars
7530 1727096042.98789: variable 'omit' from source: magic vars
7530 1727096042.98833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096042.98863: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096042.98882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096042.98897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096042.98912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096042.98934: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096042.98938: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.98940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.99015: Set connection var ansible_pipelining to False
7530 1727096042.99018: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096042.99027: Set connection var ansible_timeout to 10
7530 1727096042.99066: Set connection var ansible_shell_executable to /bin/sh
7530 1727096042.99071: Set connection var ansible_shell_type to sh
7530 1727096042.99074: Set connection var ansible_connection to ssh
7530 1727096042.99076: variable 'ansible_shell_executable' from source: unknown
7530 1727096042.99078: variable 'ansible_connection' from source: unknown
7530 1727096042.99081: variable 'ansible_module_compression' from source: unknown
7530 1727096042.99083: variable 'ansible_shell_type' from source: unknown
7530 1727096042.99085: variable 'ansible_shell_executable' from source: unknown
7530 1727096042.99087: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096042.99089: variable 'ansible_pipelining' from source: unknown
7530 1727096042.99091: variable 'ansible_timeout' from source: unknown
7530 1727096042.99092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096042.99228: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
7530 1727096042.99237: variable 'omit' from source: magic vars
7530 1727096042.99248: starting attempt loop
7530 1727096042.99251: running the handler
7530 1727096042.99259: _low_level_execute_command(): starting
7530 1727096042.99266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096042.99799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config <<<
7530 1727096042.99804: stderr
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096042.99808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096042.99864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096042.99869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096042.99882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096042.99919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096043.01625: stdout chunk (state=3): >>>/root <<< 7530 1727096043.01780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096043.01784: stdout chunk (state=3): >>><<< 7530 1727096043.01786: stderr chunk (state=3): >>><<< 7530 1727096043.01928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096043.01933: _low_level_execute_command(): starting 7530 1727096043.01936: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053 `" && echo ansible-tmp-1727096043.0181544-8807-184784053976053="` echo /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053 `" ) && sleep 0' 7530 1727096043.02530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096043.02547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096043.02562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096043.02584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096043.02704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096043.02721: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096043.02745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096043.02822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096043.04871: stdout chunk (state=3): >>>ansible-tmp-1727096043.0181544-8807-184784053976053=/root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053 <<< 7530 1727096043.05020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096043.05038: stderr chunk (state=3): >>><<< 7530 1727096043.05057: stdout chunk (state=3): >>><<< 7530 1727096043.05074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096043.0181544-8807-184784053976053=/root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096043.05273: variable 'ansible_module_compression' from source: unknown 7530 1727096043.05277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7530 1727096043.05279: variable 'ansible_facts' from source: unknown 7530 1727096043.05336: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py 7530 1727096043.05526: Sending initial data 7530 1727096043.05539: Sent initial data (160 bytes) 7530 1727096043.06142: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096043.06190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096043.06202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7530 1727096043.06214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096043.06298: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096043.06344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096043.06388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096043.08087: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096043.08141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096043.08158: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpbns8gdwr /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py <<< 7530 1727096043.08161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py" <<< 7530 1727096043.08234: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpbns8gdwr" to remote "/root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py" <<< 7530 1727096043.09035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096043.09075: stderr chunk (state=3): >>><<< 7530 1727096043.09078: stdout chunk (state=3): >>><<< 7530 1727096043.09169: done transferring module to remote 7530 1727096043.09275: _low_level_execute_command(): starting 7530 1727096043.09279: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/ /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py && sleep 0' 7530 1727096043.09877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096043.09891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096043.09905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096043.09951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096043.09963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096043.09982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096043.10060: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096043.10084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096043.10099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096043.10122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096043.10192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096043.12150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096043.12167: stdout chunk (state=3): >>><<< 7530 1727096043.12191: stderr chunk (state=3): >>><<< 7530 1727096043.12293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096043.12296: _low_level_execute_command(): starting 7530 1727096043.12299: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/AnsiballZ_service_facts.py && sleep 0' 7530 1727096043.12908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096043.12940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096043.12943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096043.13058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096043.13082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096043.13171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096044.79496: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 7530 1727096044.79553: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service",<<< 7530 1727096044.79578: stdout chunk (state=3): >>> "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7530 1727096044.81273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096044.81288: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 7530 1727096044.81374: stdout chunk (state=3): >>><<< 7530 1727096044.81378: stderr chunk (state=3): >>><<< 7530 1727096044.81385: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096044.82469: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096044.82493: _low_level_execute_command(): starting 7530 1727096044.82502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096043.0181544-8807-184784053976053/ > /dev/null 2>&1 && sleep 0' 7530 1727096044.83170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096044.83187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096044.83202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096044.83218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096044.83245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096044.83283: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096044.83377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096044.83387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096044.83436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096044.83481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096044.85432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096044.85437: stdout chunk (state=3): >>><<< 7530 1727096044.85440: stderr chunk (state=3): >>><<< 7530 1727096044.85458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096044.85473: handler run complete 7530 1727096044.85873: variable 'ansible_facts' from source: unknown 7530 1727096044.85878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096044.86377: variable 'ansible_facts' from source: unknown 7530 1727096044.86555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096044.86781: attempt loop complete, returning result 7530 1727096044.86793: _execute() done 7530 1727096044.86800: dumping result to json 7530 1727096044.86872: done dumping result, returning 7530 1727096044.86886: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-086b-f4f0-000000001386] 7530 1727096044.86896: sending task result for task 0afff68d-5257-086b-f4f0-000000001386 7530 1727096044.88329: done sending task result for task 0afff68d-5257-086b-f4f0-000000001386 7530 1727096044.88336: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096044.88445: no more pending results, returning what we have 7530 1727096044.88448: results queue empty 7530 1727096044.88449: checking for any_errors_fatal 7530 1727096044.88453: done checking for any_errors_fatal 7530 1727096044.88454: checking for max_fail_percentage 7530 1727096044.88455: done checking for max_fail_percentage 7530 1727096044.88456: checking to see if all hosts have failed and the running result is not ok 7530 1727096044.88457: done checking to see if all hosts have failed 7530 1727096044.88458: getting the remaining hosts for this loop 7530 1727096044.88459: done getting the remaining hosts for this loop 7530 1727096044.88462: getting 
the next task for host managed_node3 7530 1727096044.88471: done getting next task for host managed_node3 7530 1727096044.88475: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7530 1727096044.88479: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096044.88489: getting variables 7530 1727096044.88490: in VariableManager get_vars() 7530 1727096044.88539: Calling all_inventory to load vars for managed_node3 7530 1727096044.88542: Calling groups_inventory to load vars for managed_node3 7530 1727096044.88545: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096044.88554: Calling all_plugins_play to load vars for managed_node3 7530 1727096044.88557: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096044.88560: Calling groups_plugins_play to load vars for managed_node3 7530 1727096044.89887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096044.91904: done with get_vars() 7530 1727096044.91928: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:54:04 -0400 (0:00:01.947) 0:00:35.708 ****** 7530 1727096044.92044: entering _queue_task() for managed_node3/package_facts 7530 1727096044.92396: worker is 1 (out of 1 available) 7530 1727096044.92408: exiting _queue_task() for managed_node3/package_facts 7530 1727096044.92420: done queuing things up, now waiting for results queue to drain 7530 1727096044.92422: waiting for pending results... 
7530 1727096044.92798: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7530 1727096044.92975: in run() - task 0afff68d-5257-086b-f4f0-000000001387 7530 1727096044.92979: variable 'ansible_search_path' from source: unknown 7530 1727096044.92982: variable 'ansible_search_path' from source: unknown 7530 1727096044.92998: calling self._execute() 7530 1727096044.93110: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096044.93238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096044.93243: variable 'omit' from source: magic vars 7530 1727096044.93557: variable 'ansible_distribution_major_version' from source: facts 7530 1727096044.93583: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096044.93593: variable 'omit' from source: magic vars 7530 1727096044.93686: variable 'omit' from source: magic vars 7530 1727096044.93776: variable 'omit' from source: magic vars 7530 1727096044.93796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096044.93845: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096044.93875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096044.93910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096044.93928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096044.93972: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096044.94003: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096044.94006: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096044.94116: Set connection var ansible_pipelining to False 7530 1727096044.94221: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096044.94224: Set connection var ansible_timeout to 10 7530 1727096044.94227: Set connection var ansible_shell_executable to /bin/sh 7530 1727096044.94232: Set connection var ansible_shell_type to sh 7530 1727096044.94235: Set connection var ansible_connection to ssh 7530 1727096044.94237: variable 'ansible_shell_executable' from source: unknown 7530 1727096044.94240: variable 'ansible_connection' from source: unknown 7530 1727096044.94243: variable 'ansible_module_compression' from source: unknown 7530 1727096044.94245: variable 'ansible_shell_type' from source: unknown 7530 1727096044.94247: variable 'ansible_shell_executable' from source: unknown 7530 1727096044.94249: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096044.94251: variable 'ansible_pipelining' from source: unknown 7530 1727096044.94253: variable 'ansible_timeout' from source: unknown 7530 1727096044.94255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096044.94481: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096044.94500: variable 'omit' from source: magic vars 7530 1727096044.94510: starting attempt loop 7530 1727096044.94517: running the handler 7530 1727096044.94547: _low_level_execute_command(): starting 7530 1727096044.94565: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096044.95400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096044.95472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096044.95494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096044.95536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096044.95605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096044.97290: stdout chunk (state=3): >>>/root <<< 7530 1727096044.97438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096044.97454: stderr chunk (state=3): >>><<< 7530 1727096044.97463: stdout chunk (state=3): >>><<< 7530 1727096044.97501: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096044.97618: _low_level_execute_command(): starting 7530 1727096044.97623: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612 `" && echo ansible-tmp-1727096044.9751532-8854-102664908725612="` echo /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612 `" ) && sleep 0' 7530 1727096044.98220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096044.98238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096044.98255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096044.98276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096044.98339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096044.98404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096044.98435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096044.98458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096044.98540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096045.00532: stdout chunk (state=3): >>>ansible-tmp-1727096044.9751532-8854-102664908725612=/root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612 <<< 7530 1727096045.00781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096045.00786: stdout chunk (state=3): >>><<< 7530 1727096045.00788: stderr chunk (state=3): >>><<< 7530 1727096045.00791: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096044.9751532-8854-102664908725612=/root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096045.00793: variable 'ansible_module_compression' from source: unknown 7530 1727096045.00845: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7530 1727096045.00923: variable 'ansible_facts' from source: unknown 7530 1727096045.01137: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py 7530 1727096045.01345: Sending initial data 7530 1727096045.01348: Sent initial data (160 bytes) 7530 1727096045.01951: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096045.01986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096045.02085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096045.02111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096045.02127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096045.02189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096045.03906: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096045.03975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096045.04036: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpts86uyu5 /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py <<< 7530 1727096045.04045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py" <<< 7530 1727096045.04090: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpts86uyu5" to remote "/root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py" <<< 7530 1727096045.05683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096045.05699: stderr chunk (state=3): >>><<< 7530 1727096045.05832: stdout chunk (state=3): >>><<< 7530 1727096045.05836: done transferring module to remote 7530 1727096045.05839: _low_level_execute_command(): starting 7530 1727096045.05841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/ /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py && sleep 0' 7530 1727096045.06437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096045.06452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096045.06483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096045.06522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096045.06579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096045.06593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096045.06645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096045.06676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096045.06699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096045.06773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096045.08676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096045.08705: stderr chunk (state=3): >>><<< 7530 1727096045.08713: stdout chunk (state=3): >>><<< 7530 1727096045.08735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096045.08743: _low_level_execute_command(): starting 7530 1727096045.08753: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/AnsiballZ_package_facts.py && sleep 0' 7530 1727096045.09395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096045.09416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096045.09434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096045.09451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096045.09470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096045.09482: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096045.09495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096045.09541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096045.09604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096045.09624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096045.09655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096045.09743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096045.56117: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": 
"google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 7530 1727096045.56225: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": 
"hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": 
"0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", 
"version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": 
"util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": 
"libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", 
"version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": 
"perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 7530 1727096045.56350: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7530 1727096045.58432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096045.58471: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096045.58474: stdout chunk (state=3): >>><<< 7530 1727096045.58476: stderr chunk (state=3): >>><<< 7530 1727096045.58778: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096045.60704: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096045.60708: _low_level_execute_command(): starting 7530 1727096045.60710: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096044.9751532-8854-102664908725612/ > /dev/null 2>&1 && sleep 0' 7530 1727096045.61255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096045.61275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096045.61288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096045.61305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096045.61387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096045.61419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096045.61437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096045.61459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096045.61531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096045.63444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096045.63521: stderr chunk (state=3): >>><<< 7530 1727096045.63674: stdout chunk (state=3): >>><<< 7530 1727096045.63678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096045.63681: handler run complete 7530 1727096045.64581: variable 'ansible_facts' from source: unknown 7530 1727096045.65065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.67136: variable 'ansible_facts' from source: unknown 7530 1727096045.67611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.68372: attempt loop complete, returning result 7530 1727096045.68473: _execute() done 7530 1727096045.68477: dumping result to json 7530 1727096045.68649: done dumping result, returning 7530 1727096045.68664: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-086b-f4f0-000000001387] 7530 1727096045.68676: sending task result for task 0afff68d-5257-086b-f4f0-000000001387 7530 1727096045.71263: done sending task result for task 0afff68d-5257-086b-f4f0-000000001387 7530 1727096045.71268: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096045.71423: no more pending results, returning what we have 7530 1727096045.71426: results queue empty 7530 1727096045.71427: checking for any_errors_fatal 7530 1727096045.71436: done checking for any_errors_fatal 7530 1727096045.71437: checking for max_fail_percentage 7530 1727096045.71438: done checking for max_fail_percentage 7530 1727096045.71439: checking to see if all hosts have failed and the running result is not ok 7530 1727096045.71440: done checking to see if all hosts have failed 7530 1727096045.71441: getting the remaining hosts for this loop 7530 1727096045.71442: done getting the remaining hosts for this loop 7530 1727096045.71445: getting the next task for host 
managed_node3 7530 1727096045.71459: done getting next task for host managed_node3 7530 1727096045.71463: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096045.71466: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096045.71480: getting variables 7530 1727096045.71482: in VariableManager get_vars() 7530 1727096045.71523: Calling all_inventory to load vars for managed_node3 7530 1727096045.71526: Calling groups_inventory to load vars for managed_node3 7530 1727096045.71528: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096045.71541: Calling all_plugins_play to load vars for managed_node3 7530 1727096045.71544: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096045.71547: Calling groups_plugins_play to load vars for managed_node3 7530 1727096045.72884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.74592: done with get_vars() 7530 1727096045.74636: done getting variables 7530 1727096045.74704: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task 
path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:54:05 -0400 (0:00:00.827) 0:00:36.535 ****** 7530 1727096045.74755: entering _queue_task() for managed_node3/debug 7530 1727096045.75076: worker is 1 (out of 1 available) 7530 1727096045.75090: exiting _queue_task() for managed_node3/debug 7530 1727096045.75103: done queuing things up, now waiting for results queue to drain 7530 1727096045.75104: waiting for pending results... 7530 1727096045.75286: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096045.75387: in run() - task 0afff68d-5257-086b-f4f0-0000000000b9 7530 1727096045.75401: variable 'ansible_search_path' from source: unknown 7530 1727096045.75405: variable 'ansible_search_path' from source: unknown 7530 1727096045.75436: calling self._execute() 7530 1727096045.75515: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.75520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.75529: variable 'omit' from source: magic vars 7530 1727096045.75817: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.75828: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096045.75835: variable 'omit' from source: magic vars 7530 1727096045.75877: variable 'omit' from source: magic vars 7530 1727096045.75950: variable 'network_provider' from source: set_fact 7530 1727096045.75966: variable 'omit' from source: magic vars 7530 1727096045.76003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096045.76030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096045.76048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096045.76062: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096045.76074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096045.76101: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096045.76104: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.76106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.76178: Set connection var ansible_pipelining to False 7530 1727096045.76183: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096045.76188: Set connection var ansible_timeout to 10 7530 1727096045.76198: Set connection var ansible_shell_executable to /bin/sh 7530 1727096045.76201: Set connection var ansible_shell_type to sh 7530 1727096045.76205: Set connection var ansible_connection to ssh 7530 1727096045.76223: variable 'ansible_shell_executable' from source: unknown 7530 1727096045.76226: variable 'ansible_connection' from source: unknown 7530 1727096045.76228: variable 'ansible_module_compression' from source: unknown 7530 1727096045.76230: variable 'ansible_shell_type' from source: unknown 7530 1727096045.76232: variable 'ansible_shell_executable' from source: unknown 7530 1727096045.76238: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.76240: variable 'ansible_pipelining' from source: unknown 7530 1727096045.76244: variable 'ansible_timeout' from source: unknown 7530 1727096045.76248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.76356: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096045.76365: variable 'omit' from source: magic vars 7530 1727096045.76372: starting attempt loop 7530 1727096045.76375: running the handler 7530 1727096045.76416: handler run complete 7530 1727096045.76427: attempt loop complete, returning result 7530 1727096045.76430: _execute() done 7530 1727096045.76432: dumping result to json 7530 1727096045.76436: done dumping result, returning 7530 1727096045.76443: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-086b-f4f0-0000000000b9] 7530 1727096045.76448: sending task result for task 0afff68d-5257-086b-f4f0-0000000000b9 7530 1727096045.76533: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000b9 7530 1727096045.76536: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7530 1727096045.76624: no more pending results, returning what we have 7530 1727096045.76628: results queue empty 7530 1727096045.76628: checking for any_errors_fatal 7530 1727096045.76637: done checking for any_errors_fatal 7530 1727096045.76638: checking for max_fail_percentage 7530 1727096045.76640: done checking for max_fail_percentage 7530 1727096045.76641: checking to see if all hosts have failed and the running result is not ok 7530 1727096045.76642: done checking to see if all hosts have failed 7530 1727096045.76642: getting the remaining hosts for this loop 7530 1727096045.76646: done getting the remaining hosts for this loop 7530 1727096045.76650: getting the next task for host managed_node3 7530 1727096045.76664: done getting next task for host managed_node3 7530 1727096045.76669: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7530 1727096045.76672: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096045.76684: getting variables 7530 1727096045.76685: in VariableManager get_vars() 7530 1727096045.76728: Calling all_inventory to load vars for managed_node3 7530 1727096045.76731: Calling groups_inventory to load vars for managed_node3 7530 1727096045.76733: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096045.76742: Calling all_plugins_play to load vars for managed_node3 7530 1727096045.76744: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096045.76747: Calling groups_plugins_play to load vars for managed_node3 7530 1727096045.78085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.78966: done with get_vars() 7530 1727096045.78991: done getting variables 7530 1727096045.79040: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:54:05 -0400 (0:00:00.043) 0:00:36.579 ****** 7530 1727096045.79066: entering _queue_task() for managed_node3/fail 7530 1727096045.79327: worker is 1 (out of 1 available) 7530 1727096045.79342: exiting _queue_task() for managed_node3/fail 7530 1727096045.79353: done queuing things up, now waiting for results queue to drain 7530 1727096045.79355: waiting for pending results... 7530 1727096045.79540: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7530 1727096045.79640: in run() - task 0afff68d-5257-086b-f4f0-0000000000ba 7530 1727096045.79652: variable 'ansible_search_path' from source: unknown 7530 1727096045.79656: variable 'ansible_search_path' from source: unknown 7530 1727096045.79689: calling self._execute() 7530 1727096045.79760: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.79764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.79775: variable 'omit' from source: magic vars 7530 1727096045.80173: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.80176: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096045.80288: variable 'network_state' from source: role '' defaults 7530 1727096045.80303: Evaluated conditional (network_state != {}): False 7530 1727096045.80310: when evaluation is False, skipping this task 7530 1727096045.80316: _execute() done 7530 1727096045.80322: dumping result to json 7530 1727096045.80328: done dumping result, returning 7530 1727096045.80340: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the 
initscripts provider [0afff68d-5257-086b-f4f0-0000000000ba] 7530 1727096045.80349: sending task result for task 0afff68d-5257-086b-f4f0-0000000000ba 7530 1727096045.80555: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000ba 7530 1727096045.80559: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096045.80618: no more pending results, returning what we have 7530 1727096045.80622: results queue empty 7530 1727096045.80622: checking for any_errors_fatal 7530 1727096045.80633: done checking for any_errors_fatal 7530 1727096045.80634: checking for max_fail_percentage 7530 1727096045.80636: done checking for max_fail_percentage 7530 1727096045.80637: checking to see if all hosts have failed and the running result is not ok 7530 1727096045.80638: done checking to see if all hosts have failed 7530 1727096045.80639: getting the remaining hosts for this loop 7530 1727096045.80641: done getting the remaining hosts for this loop 7530 1727096045.80644: getting the next task for host managed_node3 7530 1727096045.80651: done getting next task for host managed_node3 7530 1727096045.80655: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096045.80658: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096045.80682: getting variables 7530 1727096045.80683: in VariableManager get_vars() 7530 1727096045.80734: Calling all_inventory to load vars for managed_node3 7530 1727096045.80737: Calling groups_inventory to load vars for managed_node3 7530 1727096045.80739: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096045.80748: Calling all_plugins_play to load vars for managed_node3 7530 1727096045.80750: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096045.80753: Calling groups_plugins_play to load vars for managed_node3 7530 1727096045.82006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.83000: done with get_vars() 7530 1727096045.83020: done getting variables 7530 1727096045.83070: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:54:05 -0400 (0:00:00.040) 0:00:36.619 ****** 7530 1727096045.83096: entering _queue_task() for managed_node3/fail 7530 1727096045.83364: worker is 1 (out of 1 available) 7530 1727096045.83387: exiting _queue_task() for managed_node3/fail 7530 1727096045.83400: done queuing things up, now waiting for results queue to drain 7530 1727096045.83402: waiting for pending results... 
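The skipped task above shows the guard pattern the role uses: the `fail` action only fires when its `when` conditions hold, and here `network_state != {}` evaluated False (the role default is an empty dict), so the task was skipped. A hedged sketch of that pattern, with the failure message and the provider condition as assumptions not taken from this log:

```yaml
# Sketch of the guard whose evaluation the log records as
# "false_condition": "network_state != {}". The msg text and the
# network_provider condition are illustrative assumptions.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}                  # False here, so the task is skipped
    - network_provider == "initscripts"    # assumed second condition, not shown in this log
```

Because `when` lists are AND-ed and evaluated in order, the log records only the first condition that decides the outcome.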
7530 1727096045.83784: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096045.83828: in run() - task 0afff68d-5257-086b-f4f0-0000000000bb 7530 1727096045.83852: variable 'ansible_search_path' from source: unknown 7530 1727096045.83860: variable 'ansible_search_path' from source: unknown 7530 1727096045.83973: calling self._execute() 7530 1727096045.84033: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.84045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.84062: variable 'omit' from source: magic vars 7530 1727096045.84465: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.84484: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096045.84613: variable 'network_state' from source: role '' defaults 7530 1727096045.84628: Evaluated conditional (network_state != {}): False 7530 1727096045.84639: when evaluation is False, skipping this task 7530 1727096045.84647: _execute() done 7530 1727096045.84657: dumping result to json 7530 1727096045.84674: done dumping result, returning 7530 1727096045.84684: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-086b-f4f0-0000000000bb] 7530 1727096045.84689: sending task result for task 0afff68d-5257-086b-f4f0-0000000000bb skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096045.84825: no more pending results, returning what we have 7530 1727096045.84829: results queue empty 7530 1727096045.84830: checking for any_errors_fatal 7530 1727096045.84843: done checking for any_errors_fatal 7530 1727096045.84844: 
checking for max_fail_percentage 7530 1727096045.84845: done checking for max_fail_percentage 7530 1727096045.84846: checking to see if all hosts have failed and the running result is not ok 7530 1727096045.84847: done checking to see if all hosts have failed 7530 1727096045.84848: getting the remaining hosts for this loop 7530 1727096045.84849: done getting the remaining hosts for this loop 7530 1727096045.84853: getting the next task for host managed_node3 7530 1727096045.84859: done getting next task for host managed_node3 7530 1727096045.84862: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096045.84865: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096045.84891: getting variables 7530 1727096045.84893: in VariableManager get_vars() 7530 1727096045.84947: Calling all_inventory to load vars for managed_node3 7530 1727096045.84951: Calling groups_inventory to load vars for managed_node3 7530 1727096045.84953: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096045.84973: Calling all_plugins_play to load vars for managed_node3 7530 1727096045.84978: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096045.84983: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000bb 7530 1727096045.84986: WORKER PROCESS EXITING 7530 1727096045.84990: Calling groups_plugins_play to load vars for managed_node3 7530 1727096045.86280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.87192: done with get_vars() 7530 1727096045.87217: done getting variables 7530 1727096045.87264: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:54:05 -0400 (0:00:00.041) 0:00:36.661 ****** 7530 1727096045.87294: entering _queue_task() for managed_node3/fail 7530 1727096045.87557: worker is 1 (out of 1 available) 7530 1727096045.87573: exiting _queue_task() for managed_node3/fail 7530 1727096045.87585: done queuing things up, now waiting for results queue to drain 7530 1727096045.87587: waiting for pending results... 
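The trace that follows evaluates the teaming guard's conditions one at a time: `ansible_distribution_major_version | int > 9` is True and `ansible_distribution in __network_rh_distros` is True. A sketch of a task consistent with those logged conditionals (the failure message is an assumption; the real role source may differ):

```yaml
# Hedged sketch of the EL10 teaming guard. The two conditions match the
# conditionals the log evaluates; the msg text is illustrative.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9     # evaluated True in this run
    - ansible_distribution in __network_rh_distros     # evaluated True in this run
```

Note the `| int` cast: `ansible_distribution_major_version` is a string fact, so a bare `> 9` comparison would not behave numerically.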
7530 1727096045.87774: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096045.87864: in run() - task 0afff68d-5257-086b-f4f0-0000000000bc 7530 1727096045.87881: variable 'ansible_search_path' from source: unknown 7530 1727096045.87885: variable 'ansible_search_path' from source: unknown 7530 1727096045.87913: calling self._execute() 7530 1727096045.87995: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.87999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.88009: variable 'omit' from source: magic vars 7530 1727096045.88373: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.88376: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096045.88603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096045.90366: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096045.90416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096045.90445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096045.90472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096045.90493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096045.90555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.90577: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.90594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.90625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.90638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.90710: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.90725: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7530 1727096045.90809: variable 'ansible_distribution' from source: facts 7530 1727096045.90813: variable '__network_rh_distros' from source: role '' defaults 7530 1727096045.90823: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7530 1727096045.90986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.91004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.91022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.91052: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.91063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.91097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.91113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.91130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.91158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.91175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.91204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.91220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 
1727096045.91236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.91260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.91277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.91488: variable 'network_connections' from source: task vars 7530 1727096045.91499: variable 'interface' from source: play vars 7530 1727096045.91547: variable 'interface' from source: play vars 7530 1727096045.91559: variable 'network_state' from source: role '' defaults 7530 1727096045.91611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096045.91734: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096045.91761: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096045.91785: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096045.91812: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096045.91844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096045.91859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 7530 1727096045.91882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.91951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096045.92035: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7530 1727096045.92040: when evaluation is False, skipping this task 7530 1727096045.92172: _execute() done 7530 1727096045.92177: dumping result to json 7530 1727096045.92180: done dumping result, returning 7530 1727096045.92185: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-086b-f4f0-0000000000bc] 7530 1727096045.92188: sending task result for task 0afff68d-5257-086b-f4f0-0000000000bc 7530 1727096045.92256: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000bc 7530 1727096045.92258: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7530 1727096045.92311: no more pending results, returning what we have 7530 1727096045.92314: results queue empty 7530 1727096045.92315: checking for any_errors_fatal 7530 
1727096045.92324: done checking for any_errors_fatal 7530 1727096045.92324: checking for max_fail_percentage 7530 1727096045.92326: done checking for max_fail_percentage 7530 1727096045.92327: checking to see if all hosts have failed and the running result is not ok 7530 1727096045.92328: done checking to see if all hosts have failed 7530 1727096045.92329: getting the remaining hosts for this loop 7530 1727096045.92332: done getting the remaining hosts for this loop 7530 1727096045.92336: getting the next task for host managed_node3 7530 1727096045.92342: done getting next task for host managed_node3 7530 1727096045.92345: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7530 1727096045.92348: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096045.92365: getting variables 7530 1727096045.92369: in VariableManager get_vars() 7530 1727096045.92412: Calling all_inventory to load vars for managed_node3 7530 1727096045.92415: Calling groups_inventory to load vars for managed_node3 7530 1727096045.92417: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096045.92425: Calling all_plugins_play to load vars for managed_node3 7530 1727096045.92427: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096045.92430: Calling groups_plugins_play to load vars for managed_node3 7530 1727096045.93356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096045.94455: done with get_vars() 7530 1727096045.94488: done getting variables 7530 1727096045.94548: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:54:05 -0400 (0:00:00.072) 0:00:36.734 ****** 7530 1727096045.94583: entering _queue_task() for managed_node3/dnf 7530 1727096045.94918: worker is 1 (out of 1 available) 7530 1727096045.94930: exiting _queue_task() for managed_node3/dnf 7530 1727096045.94943: done queuing things up, now waiting for results queue to drain 7530 1727096045.94945: waiting for pending results... 
7530 1727096045.95296: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7530 1727096045.95405: in run() - task 0afff68d-5257-086b-f4f0-0000000000bd 7530 1727096045.95428: variable 'ansible_search_path' from source: unknown 7530 1727096045.95573: variable 'ansible_search_path' from source: unknown 7530 1727096045.95577: calling self._execute() 7530 1727096045.95579: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096045.95592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096045.95606: variable 'omit' from source: magic vars 7530 1727096045.95986: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.96003: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096045.96215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096045.98192: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096045.98239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096045.98270: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096045.98295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096045.98318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096045.98380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.98400: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.98422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.98448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.98459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.98549: variable 'ansible_distribution' from source: facts 7530 1727096045.98553: variable 'ansible_distribution_major_version' from source: facts 7530 1727096045.98566: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7530 1727096045.98649: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096045.98736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.98754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.98772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.98797: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.98807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.98836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.98856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.98873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.98897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.98908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.98936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096045.98951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096045.98973: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.98998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096045.99008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096045.99115: variable 'network_connections' from source: task vars 7530 1727096045.99126: variable 'interface' from source: play vars 7530 1727096045.99174: variable 'interface' from source: play vars 7530 1727096045.99228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096045.99362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096045.99397: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096045.99572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096045.99576: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096045.99578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096045.99580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096045.99591: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096045.99593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096045.99645: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096045.99888: variable 'network_connections' from source: task vars 7530 1727096045.99898: variable 'interface' from source: play vars 7530 1727096045.99962: variable 'interface' from source: play vars 7530 1727096046.00004: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096046.00011: when evaluation is False, skipping this task 7530 1727096046.00018: _execute() done 7530 1727096046.00024: dumping result to json 7530 1727096046.00030: done dumping result, returning 7530 1727096046.00042: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-0000000000bd] 7530 1727096046.00051: sending task result for task 0afff68d-5257-086b-f4f0-0000000000bd 7530 1727096046.00157: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000bd 7530 1727096046.00159: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096046.00244: no more pending results, returning what we have 7530 1727096046.00248: results queue empty 7530 1727096046.00248: checking for any_errors_fatal 7530 1727096046.00255: done checking for any_errors_fatal 7530 1727096046.00256: checking for 
max_fail_percentage 7530 1727096046.00258: done checking for max_fail_percentage 7530 1727096046.00259: checking to see if all hosts have failed and the running result is not ok 7530 1727096046.00260: done checking to see if all hosts have failed 7530 1727096046.00261: getting the remaining hosts for this loop 7530 1727096046.00262: done getting the remaining hosts for this loop 7530 1727096046.00265: getting the next task for host managed_node3 7530 1727096046.00274: done getting next task for host managed_node3 7530 1727096046.00278: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7530 1727096046.00280: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096046.00299: getting variables 7530 1727096046.00300: in VariableManager get_vars() 7530 1727096046.00347: Calling all_inventory to load vars for managed_node3 7530 1727096046.00350: Calling groups_inventory to load vars for managed_node3 7530 1727096046.00352: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096046.00361: Calling all_plugins_play to load vars for managed_node3 7530 1727096046.00363: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096046.00366: Calling groups_plugins_play to load vars for managed_node3 7530 1727096046.01723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096046.03315: done with get_vars() 7530 1727096046.03349: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7530 1727096046.03431: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:54:06 -0400 (0:00:00.088) 0:00:36.822 ****** 7530 1727096046.03465: entering _queue_task() for managed_node3/yum 7530 1727096046.03814: worker is 1 (out of 1 available) 7530 1727096046.03828: exiting _queue_task() for managed_node3/yum 7530 1727096046.03841: done queuing things up, now waiting for results queue to drain 7530 1727096046.03842: waiting for pending results... 
7530 1727096046.04197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7530 1727096046.04476: in run() - task 0afff68d-5257-086b-f4f0-0000000000be 7530 1727096046.04480: variable 'ansible_search_path' from source: unknown 7530 1727096046.04483: variable 'ansible_search_path' from source: unknown 7530 1727096046.04486: calling self._execute() 7530 1727096046.04488: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096046.04491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096046.04493: variable 'omit' from source: magic vars 7530 1727096046.04857: variable 'ansible_distribution_major_version' from source: facts 7530 1727096046.04877: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096046.05061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096046.07643: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096046.07717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096046.07765: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096046.07805: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096046.07840: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096046.07923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096046.07961: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096046.07993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096046.08036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096046.08059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096046.08171: variable 'ansible_distribution_major_version' from source: facts 7530 1727096046.08193: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7530 1727096046.08200: when evaluation is False, skipping this task 7530 1727096046.08209: _execute() done 7530 1727096046.08216: dumping result to json 7530 1727096046.08225: done dumping result, returning 7530 1727096046.08236: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-0000000000be] 7530 1727096046.08245: sending task result for task 0afff68d-5257-086b-f4f0-0000000000be skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7530 1727096046.08427: no more pending results, returning what we have 7530 1727096046.08431: results queue empty 7530 1727096046.08432: checking for any_errors_fatal 7530 1727096046.08440: done checking for any_errors_fatal 
7530 1727096046.08440: checking for max_fail_percentage 7530 1727096046.08442: done checking for max_fail_percentage 7530 1727096046.08443: checking to see if all hosts have failed and the running result is not ok 7530 1727096046.08445: done checking to see if all hosts have failed 7530 1727096046.08445: getting the remaining hosts for this loop 7530 1727096046.08447: done getting the remaining hosts for this loop 7530 1727096046.08451: getting the next task for host managed_node3 7530 1727096046.08458: done getting next task for host managed_node3 7530 1727096046.08462: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7530 1727096046.08465: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096046.08490: getting variables 7530 1727096046.08492: in VariableManager get_vars() 7530 1727096046.08544: Calling all_inventory to load vars for managed_node3 7530 1727096046.08547: Calling groups_inventory to load vars for managed_node3 7530 1727096046.08549: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096046.08560: Calling all_plugins_play to load vars for managed_node3 7530 1727096046.08564: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096046.08566: Calling groups_plugins_play to load vars for managed_node3 7530 1727096046.09583: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000be 7530 1727096046.09588: WORKER PROCESS EXITING 7530 1727096046.16372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096046.17899: done with get_vars() 7530 1727096046.17931: done getting variables 7530 1727096046.17980: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:54:06 -0400 (0:00:00.145) 0:00:36.968 ****** 7530 1727096046.18012: entering _queue_task() for managed_node3/fail 7530 1727096046.18339: worker is 1 (out of 1 available) 7530 1727096046.18351: exiting _queue_task() for managed_node3/fail 7530 1727096046.18363: done queuing things up, now waiting for results queue to drain 7530 1727096046.18364: waiting for pending results... 
7530 1727096046.18662: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
7530 1727096046.18813: in run() - task 0afff68d-5257-086b-f4f0-0000000000bf
7530 1727096046.18834: variable 'ansible_search_path' from source: unknown
7530 1727096046.18843: variable 'ansible_search_path' from source: unknown
7530 1727096046.18888: calling self._execute()
7530 1727096046.19001: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096046.19019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096046.19034: variable 'omit' from source: magic vars
7530 1727096046.19431: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.19454: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096046.19583: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096046.19787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096046.22677: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096046.22682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096046.22684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096046.22718: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096046.22777: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096046.22962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.23051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.23174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.23282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.23301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.23399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.23482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.23598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.23645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.23692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.23827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.23860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.24107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.24111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.24114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.24480: variable 'network_connections' from source: task vars
7530 1727096046.24525: variable 'interface' from source: play vars
7530 1727096046.24665: variable 'interface' from source: play vars
7530 1727096046.24756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096046.24953: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096046.25001: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096046.25038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096046.25071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096046.25119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096046.25147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096046.25177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.25273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096046.25286: variable '__network_team_connections_defined' from source: role '' defaults
7530 1727096046.25561: variable 'network_connections' from source: task vars
7530 1727096046.25575: variable 'interface' from source: play vars
7530 1727096046.25650: variable 'interface' from source: play vars
7530 1727096046.25694: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7530 1727096046.25703: when evaluation is False, skipping this task
7530 1727096046.25712: _execute() done
7530 1727096046.25719: dumping result to json
7530 1727096046.25726: done dumping result, returning
7530 1727096046.25746: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-0000000000bf]
7530 1727096046.25758: sending task result for task 0afff68d-5257-086b-f4f0-0000000000bf
7530 1727096046.25918: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000bf
7530 1727096046.25921: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7530 1727096046.26006: no more pending results, returning what we have
7530 1727096046.26010: results queue empty
7530 1727096046.26011: checking for any_errors_fatal
7530 1727096046.26021: done checking for any_errors_fatal
7530 1727096046.26022: checking for max_fail_percentage
7530 1727096046.26024: done checking for max_fail_percentage
7530 1727096046.26025: checking to see if all hosts have failed and the running result is not ok
7530 1727096046.26026: done checking to see if all hosts have failed
7530 1727096046.26026: getting the remaining hosts for this loop
7530 1727096046.26028: done getting the remaining hosts for this loop
7530 1727096046.26034: getting the next task for host managed_node3
7530 1727096046.26042: done getting next task for host managed_node3
7530 1727096046.26047: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
7530 1727096046.26050: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096046.26072: getting variables
7530 1727096046.26074: in VariableManager get_vars()
7530 1727096046.26127: Calling all_inventory to load vars for managed_node3
7530 1727096046.26133: Calling groups_inventory to load vars for managed_node3
7530 1727096046.26137: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096046.26148: Calling all_plugins_play to load vars for managed_node3
7530 1727096046.26151: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096046.26155: Calling groups_plugins_play to load vars for managed_node3
7530 1727096046.28824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096046.30489: done with get_vars()
7530 1727096046.30525: done getting variables
7530 1727096046.30594: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Monday 23 September 2024 08:54:06 -0400 (0:00:00.126) 0:00:37.094 ******
7530 1727096046.30634: entering _queue_task() for managed_node3/package
7530 1727096046.30991: worker is 1 (out of 1 available)
7530 1727096046.31005: exiting _queue_task() for managed_node3/package
7530 1727096046.31017: done queuing things up, now waiting for results queue to drain
7530 1727096046.31018: waiting for pending results...
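The 'package' action queued above maps to the role task at roles/network/tasks/main.yml:73. A minimal sketch of what that task plausibly looks like; the task name and the guard condition are taken from this log, while the module arguments are assumed:

```yaml
# Hypothetical sketch of roles/network/tasks/main.yml:73 -- only the name and
# the when-condition are confirmed by the log; the module arguments are assumed.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

The guard makes the install idempotent and cheap: the task runs only if some entry in network_packages is missing from the gathered package facts. In the run below the subset test holds, so the task is skipped.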
7530 1727096046.31398: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
7530 1727096046.31498: in run() - task 0afff68d-5257-086b-f4f0-0000000000c0
7530 1727096046.31673: variable 'ansible_search_path' from source: unknown
7530 1727096046.31677: variable 'ansible_search_path' from source: unknown
7530 1727096046.31680: calling self._execute()
7530 1727096046.31683: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096046.31685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096046.31695: variable 'omit' from source: magic vars
7530 1727096046.32091: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.32109: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096046.32320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7530 1727096046.32603: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7530 1727096046.32656: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7530 1727096046.32739: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7530 1727096046.32782: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7530 1727096046.32912: variable 'network_packages' from source: role '' defaults
7530 1727096046.33036: variable '__network_provider_setup' from source: role '' defaults
7530 1727096046.33053: variable '__network_service_name_default_nm' from source: role '' defaults
7530 1727096046.33126: variable '__network_service_name_default_nm' from source: role '' defaults
7530 1727096046.33144: variable '__network_packages_default_nm' from source: role '' defaults
7530 1727096046.33211: variable '__network_packages_default_nm' from source: role '' defaults
7530 1727096046.33427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7530 1727096046.36760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7530 1727096046.37010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7530 1727096046.37014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7530 1727096046.37017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7530 1727096046.37143: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7530 1727096046.37318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.37555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.37559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.37583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.37604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.37714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.37796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.37823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.38091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.38094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.38441: variable '__network_packages_default_gobject_packages' from source: role '' defaults
7530 1727096046.38735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.38769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.38839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.38962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.38984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.39210: variable 'ansible_python' from source: facts
7530 1727096046.39464: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
7530 1727096046.39469: variable '__network_wpa_supplicant_required' from source: role '' defaults
7530 1727096046.39657: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
7530 1727096046.39947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.40000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.40227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.40232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.40235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.40374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7530 1727096046.40411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7530 1727096046.40444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.40485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7530 1727096046.40776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7530 1727096046.40845: variable 'network_connections' from source: task vars
7530 1727096046.41101: variable 'interface' from source: play vars
7530 1727096046.41177: variable 'interface' from source: play vars
7530 1727096046.41262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7530 1727096046.41351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7530 1727096046.41459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7530 1727096046.41495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7530 1727096046.41588: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096046.42178: variable 'network_connections' from source: task vars
7530 1727096046.42295: variable 'interface' from source: play vars
7530 1727096046.42492: variable 'interface' from source: play vars
7530 1727096046.42563: variable '__network_packages_default_wireless' from source: role '' defaults
7530 1727096046.42813: variable '__network_wireless_connections_defined' from source: role '' defaults
7530 1727096046.43524: variable 'network_connections' from source: task vars
7530 1727096046.43602: variable 'interface' from source: play vars
7530 1727096046.43680: variable 'interface' from source: play vars
7530 1727096046.43845: variable '__network_packages_default_team' from source: role '' defaults
7530 1727096046.43991: variable '__network_team_connections_defined' from source: role '' defaults
7530 1727096046.44558: variable 'network_connections' from source: task vars
7530 1727096046.44693: variable 'interface' from source: play vars
7530 1727096046.44776: variable 'interface' from source: play vars
7530 1727096046.45036: variable '__network_service_name_default_initscripts' from source: role '' defaults
7530 1727096046.45166: variable '__network_service_name_default_initscripts' from source: role '' defaults
7530 1727096046.45198: variable '__network_packages_default_initscripts' from source: role '' defaults
7530 1727096046.45291: variable '__network_packages_default_initscripts' from source: role '' defaults
7530 1727096046.45848: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
7530 1727096046.47012: variable 'network_connections' from source: task vars
7530 1727096046.47060: variable 'interface' from source: play vars
7530 1727096046.47198: variable 'interface' from source: play vars
7530 1727096046.47281: variable 'ansible_distribution' from source: facts
7530 1727096046.47289: variable '__network_rh_distros' from source: role '' defaults
7530 1727096046.47298: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.47365: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
7530 1727096046.47743: variable 'ansible_distribution' from source: facts
7530 1727096046.47820: variable '__network_rh_distros' from source: role '' defaults
7530 1727096046.47833: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.47851: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
7530 1727096046.48275: variable 'ansible_distribution' from source: facts
7530 1727096046.48470: variable '__network_rh_distros' from source: role '' defaults
7530 1727096046.48474: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.48476: variable 'network_provider' from source: set_fact
7530 1727096046.48478: variable 'ansible_facts' from source: unknown
7530 1727096046.49984: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
7530 1727096046.50165: when evaluation is False, skipping this task
7530 1727096046.50170: _execute() done
7530 1727096046.50173: dumping result to json
7530 1727096046.50175: done dumping result, returning
7530 1727096046.50178: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-086b-f4f0-0000000000c0]
7530 1727096046.50181: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c0
7530 1727096046.50257: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c0
7530 1727096046.50261: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
7530 1727096046.50318: no more pending results, returning what we have
7530 1727096046.50322: results queue empty
7530 1727096046.50323: checking for any_errors_fatal
7530 1727096046.50333: done checking for any_errors_fatal
7530 1727096046.50334: checking for max_fail_percentage
7530 1727096046.50337: done checking for max_fail_percentage
7530 1727096046.50338: checking to see if all hosts have failed and the running result is not ok
7530 1727096046.50339: done checking to see if all hosts have failed
7530 1727096046.50339: getting the remaining hosts for this loop
7530 1727096046.50341: done getting the remaining hosts for this loop
7530 1727096046.50345: getting the next task for host managed_node3
7530 1727096046.50352: done getting next task for host managed_node3
7530 1727096046.50357: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7530 1727096046.50360: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096046.50383: getting variables
7530 1727096046.50385: in VariableManager get_vars()
7530 1727096046.50440: Calling all_inventory to load vars for managed_node3
7530 1727096046.50443: Calling groups_inventory to load vars for managed_node3
7530 1727096046.50445: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096046.50456: Calling all_plugins_play to load vars for managed_node3
7530 1727096046.50459: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096046.50463: Calling groups_plugins_play to load vars for managed_node3
7530 1727096046.54319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096046.57525: done with get_vars()
7530 1727096046.57558: done getting variables
7530 1727096046.58030: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Monday 23 September 2024 08:54:06 -0400 (0:00:00.274) 0:00:37.369 ******
7530 1727096046.58073: entering _queue_task() for managed_node3/package
7530 1727096046.59064: worker is 1 (out of 1 available)
7530 1727096046.59080: exiting _queue_task() for managed_node3/package
7530 1727096046.59093: done queuing things up, now waiting for results queue to drain
7530 1727096046.59095: waiting for pending results...
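The task queued above, at roles/network/tasks/main.yml:85, is gated on the network_state variable. A hedged sketch of it, assuming a straightforward package invocation; the package names are inferred from the task title and are an assumption, while the guard comes from the conditional the log evaluates:

```yaml
# Hypothetical sketch of roles/network/tasks/main.yml:85 -- the task name and
# the when-condition appear in this log; package names and arguments are assumed.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

Because network_state keeps its empty-dict role default in this play, the guard is False and the task is skipped, as the result below shows.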
7530 1727096046.60041: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7530 1727096046.60274: in run() - task 0afff68d-5257-086b-f4f0-0000000000c1
7530 1727096046.60279: variable 'ansible_search_path' from source: unknown
7530 1727096046.60282: variable 'ansible_search_path' from source: unknown
7530 1727096046.60285: calling self._execute()
7530 1727096046.60630: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096046.60634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096046.60638: variable 'omit' from source: magic vars
7530 1727096046.61874: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.61901: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096046.62148: variable 'network_state' from source: role '' defaults
7530 1727096046.62166: Evaluated conditional (network_state != {}): False
7530 1727096046.62221: when evaluation is False, skipping this task
7530 1727096046.62230: _execute() done
7530 1727096046.62238: dumping result to json
7530 1727096046.62246: done dumping result, returning
7530 1727096046.62260: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-086b-f4f0-0000000000c1]
7530 1727096046.62275: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c1
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7530 1727096046.62737: no more pending results, returning what we have
7530 1727096046.62741: results queue empty
7530 1727096046.62743: checking for any_errors_fatal
7530 1727096046.62750: done checking for any_errors_fatal
7530 1727096046.62751: checking for max_fail_percentage
7530 1727096046.62753: done checking for max_fail_percentage
7530 1727096046.62754: checking to see if all hosts have failed and the running result is not ok
7530 1727096046.62755: done checking to see if all hosts have failed
7530 1727096046.62756: getting the remaining hosts for this loop
7530 1727096046.62757: done getting the remaining hosts for this loop
7530 1727096046.62762: getting the next task for host managed_node3
7530 1727096046.62771: done getting next task for host managed_node3
7530 1727096046.62776: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
7530 1727096046.62780: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096046.62805: getting variables
7530 1727096046.62807: in VariableManager get_vars()
7530 1727096046.62864: Calling all_inventory to load vars for managed_node3
7530 1727096046.63071: Calling groups_inventory to load vars for managed_node3
7530 1727096046.63075: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096046.63088: Calling all_plugins_play to load vars for managed_node3
7530 1727096046.63091: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096046.63095: Calling groups_plugins_play to load vars for managed_node3
7530 1727096046.63878: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c1
7530 1727096046.63882: WORKER PROCESS EXITING
7530 1727096046.65635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096046.69588: done with get_vars()
7530 1727096046.69618: done getting variables
7530 1727096046.69860: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Monday 23 September 2024 08:54:06 -0400 (0:00:00.118) 0:00:37.487 ******
7530 1727096046.69896: entering _queue_task() for managed_node3/package
7530 1727096046.70427: worker is 1 (out of 1 available)
7530 1727096046.70441: exiting _queue_task() for managed_node3/package
7530 1727096046.70453: done queuing things up, now waiting for results queue to drain
7530 1727096046.70455: waiting for pending results...
7530 1727096046.70893: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
7530 1727096046.71575: in run() - task 0afff68d-5257-086b-f4f0-0000000000c2
7530 1727096046.71580: variable 'ansible_search_path' from source: unknown
7530 1727096046.71582: variable 'ansible_search_path' from source: unknown
7530 1727096046.71585: calling self._execute()
7530 1727096046.71975: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096046.71979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096046.71982: variable 'omit' from source: magic vars
7530 1727096046.72606: variable 'ansible_distribution_major_version' from source: facts
7530 1727096046.72628: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096046.72763: variable 'network_state' from source: role '' defaults
7530 1727096046.72986: Evaluated conditional (network_state != {}): False
7530 1727096046.72994: when evaluation is False, skipping this task
7530 1727096046.73001: _execute() done
7530 1727096046.73009: dumping result to json
7530 1727096046.73015: done dumping result, returning
7530 1727096046.73028: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-086b-f4f0-0000000000c2]
7530 1727096046.73042: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c2
7530 1727096046.73165: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c2
7530 1727096046.73175: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7530 1727096046.73225: no more pending results, returning what we have
7530 1727096046.73229: results queue empty
7530 1727096046.73230: checking for any_errors_fatal
7530 1727096046.73237: done checking for any_errors_fatal
7530 1727096046.73237: checking for max_fail_percentage
7530 1727096046.73239: done checking for max_fail_percentage
7530 1727096046.73240: checking to see if all hosts have failed and the running result is not ok
7530 1727096046.73241: done checking to see if all hosts have failed
7530 1727096046.73242: getting the remaining hosts for this loop
7530 1727096046.73243: done getting the remaining hosts for this loop
7530 1727096046.73247: getting the next task for host managed_node3
7530 1727096046.73254: done getting next task for host managed_node3
7530 1727096046.73257: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
7530 1727096046.73260: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7530 1727096046.73286: getting variables 7530 1727096046.73288: in VariableManager get_vars() 7530 1727096046.73341: Calling all_inventory to load vars for managed_node3 7530 1727096046.73344: Calling groups_inventory to load vars for managed_node3 7530 1727096046.73347: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096046.73359: Calling all_plugins_play to load vars for managed_node3 7530 1727096046.73363: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096046.73365: Calling groups_plugins_play to load vars for managed_node3 7530 1727096046.75274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096046.77530: done with get_vars() 7530 1727096046.77569: done getting variables 7530 1727096046.77752: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:54:06 -0400 (0:00:00.079) 0:00:37.566 ****** 7530 1727096046.77857: entering _queue_task() for managed_node3/service 7530 1727096046.78625: worker is 1 (out of 1 available) 7530 1727096046.78640: exiting _queue_task() for managed_node3/service 7530 1727096046.78654: done queuing things up, now waiting for results queue to drain 7530 1727096046.78656: waiting for pending results... 
7530 1727096046.79150: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096046.79444: in run() - task 0afff68d-5257-086b-f4f0-0000000000c3 7530 1727096046.79536: variable 'ansible_search_path' from source: unknown 7530 1727096046.79544: variable 'ansible_search_path' from source: unknown 7530 1727096046.79590: calling self._execute() 7530 1727096046.79883: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096046.79896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096046.79914: variable 'omit' from source: magic vars 7530 1727096046.80942: variable 'ansible_distribution_major_version' from source: facts 7530 1727096046.80946: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096046.81044: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096046.81594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096046.87045: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096046.87578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096046.87583: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096046.87586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096046.87588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096046.87840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 
1727096046.87882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096046.87913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096046.88017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096046.88084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096046.88141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096046.88299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096046.88327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096046.88371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096046.88427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7530 1727096046.88474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096046.88547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096046.88656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096046.88698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096046.88753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096046.89174: variable 'network_connections' from source: task vars 7530 1727096046.89195: variable 'interface' from source: play vars 7530 1727096046.89277: variable 'interface' from source: play vars 7530 1727096046.89507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096046.89912: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096046.90869: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096046.90899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096046.90936: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096046.91023: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096046.91109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096046.91143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096046.91219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096046.91348: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096046.91899: variable 'network_connections' from source: task vars 7530 1727096046.91963: variable 'interface' from source: play vars 7530 1727096046.92135: variable 'interface' from source: play vars 7530 1727096046.92372: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096046.92376: when evaluation is False, skipping this task 7530 1727096046.92379: _execute() done 7530 1727096046.92381: dumping result to json 7530 1727096046.92384: done dumping result, returning 7530 1727096046.92387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-0000000000c3] 7530 1727096046.92389: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c3 7530 1727096046.92462: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c3 7530 1727096046.92474: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096046.92521: no more pending results, returning what we have 7530 1727096046.92525: results queue empty 7530 1727096046.92526: checking for any_errors_fatal 7530 1727096046.92533: done checking for any_errors_fatal 7530 1727096046.92534: checking for max_fail_percentage 7530 1727096046.92536: done checking for max_fail_percentage 7530 1727096046.92537: checking to see if all hosts have failed and the running result is not ok 7530 1727096046.92538: done checking to see if all hosts have failed 7530 1727096046.92539: getting the remaining hosts for this loop 7530 1727096046.92540: done getting the remaining hosts for this loop 7530 1727096046.92544: getting the next task for host managed_node3 7530 1727096046.92551: done getting next task for host managed_node3 7530 1727096046.92555: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096046.92558: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096046.92581: getting variables 7530 1727096046.92583: in VariableManager get_vars() 7530 1727096046.92638: Calling all_inventory to load vars for managed_node3 7530 1727096046.92641: Calling groups_inventory to load vars for managed_node3 7530 1727096046.92644: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096046.92655: Calling all_plugins_play to load vars for managed_node3 7530 1727096046.92658: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096046.92661: Calling groups_plugins_play to load vars for managed_node3 7530 1727096046.95784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096046.97534: done with get_vars() 7530 1727096046.97565: done getting variables 7530 1727096046.97627: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:54:06 -0400 (0:00:00.198) 0:00:37.765 ****** 7530 1727096046.97669: entering _queue_task() for managed_node3/service 7530 1727096046.98043: worker is 1 (out of 1 available) 7530 1727096046.98057: exiting _queue_task() for managed_node3/service 7530 1727096046.98073: done queuing things up, now waiting for results queue to drain 7530 1727096046.98075: waiting for pending results... 
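At this point the log queues the "Enable and start NetworkManager" task (task path `.../roles/network/tasks/main.yml:122`) via the `service` action, and shows the conditional `network_provider == "nm" or network_state != {}` evaluating to True. As a hedged sketch reconstructed from those log entries (the role's actual YAML, and the resolved value of `network_service_name`, are assumptions here), the task is roughly:

```yaml
# Hypothetical reconstruction from the log entries above; the role's actual YAML may differ.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    # The log loads network_service_name from role defaults; assuming it
    # resolves to "NetworkManager" for the nm provider.
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  # Evaluated True in this run because network_provider was set to "nm" via set_fact.
  when: network_provider == "nm" or network_state != {}
```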
7530 1727096046.98506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096046.98584: in run() - task 0afff68d-5257-086b-f4f0-0000000000c4 7530 1727096046.98589: variable 'ansible_search_path' from source: unknown 7530 1727096046.98592: variable 'ansible_search_path' from source: unknown 7530 1727096046.98673: calling self._execute() 7530 1727096046.98753: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096046.98764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096046.98781: variable 'omit' from source: magic vars 7530 1727096046.99199: variable 'ansible_distribution_major_version' from source: facts 7530 1727096046.99216: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096046.99457: variable 'network_provider' from source: set_fact 7530 1727096046.99465: variable 'network_state' from source: role '' defaults 7530 1727096046.99469: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7530 1727096046.99472: variable 'omit' from source: magic vars 7530 1727096046.99507: variable 'omit' from source: magic vars 7530 1727096046.99544: variable 'network_service_name' from source: role '' defaults 7530 1727096046.99624: variable 'network_service_name' from source: role '' defaults 7530 1727096046.99743: variable '__network_provider_setup' from source: role '' defaults 7530 1727096046.99755: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096046.99829: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096046.99872: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096046.99921: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096047.00159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 
1727096047.02396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096047.02503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096047.02535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096047.02613: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096047.02617: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096047.02700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.02745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.02779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.02873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.02880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.03012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7530 1727096047.03015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.03017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.03046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.03074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.03353: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7530 1727096047.03545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.03548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.03550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.03596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.03616: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.03723: variable 'ansible_python' from source: facts 7530 1727096047.03752: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7530 1727096047.03869: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096047.03962: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096047.04113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.04196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.04199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.04269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.04316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.04422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.04528: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.04534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.04565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.04605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.04779: variable 'network_connections' from source: task vars 7530 1727096047.04799: variable 'interface' from source: play vars 7530 1727096047.04908: variable 'interface' from source: play vars 7530 1727096047.05029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096047.05292: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096047.05344: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096047.05401: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096047.05509: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096047.05538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096047.05582: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096047.05625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.05666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096047.05735: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096047.06106: variable 'network_connections' from source: task vars 7530 1727096047.06270: variable 'interface' from source: play vars 7530 1727096047.06276: variable 'interface' from source: play vars 7530 1727096047.06313: variable '__network_packages_default_wireless' from source: role '' defaults 7530 1727096047.06405: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096047.06976: variable 'network_connections' from source: task vars 7530 1727096047.06980: variable 'interface' from source: play vars 7530 1727096047.07106: variable 'interface' from source: play vars 7530 1727096047.07140: variable '__network_packages_default_team' from source: role '' defaults 7530 1727096047.07258: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096047.07827: variable 'network_connections' from source: task vars 7530 1727096047.07876: variable 'interface' from source: play vars 7530 1727096047.07955: variable 'interface' from source: play vars 7530 1727096047.08106: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096047.08182: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096047.08201: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7530 1727096047.08264: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096047.08497: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7530 1727096047.09036: variable 'network_connections' from source: task vars 7530 1727096047.09049: variable 'interface' from source: play vars 7530 1727096047.09120: variable 'interface' from source: play vars 7530 1727096047.09136: variable 'ansible_distribution' from source: facts 7530 1727096047.09144: variable '__network_rh_distros' from source: role '' defaults 7530 1727096047.09155: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.09191: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7530 1727096047.09376: variable 'ansible_distribution' from source: facts 7530 1727096047.09391: variable '__network_rh_distros' from source: role '' defaults 7530 1727096047.09401: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.09418: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7530 1727096047.09596: variable 'ansible_distribution' from source: facts 7530 1727096047.09718: variable '__network_rh_distros' from source: role '' defaults 7530 1727096047.09721: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.09723: variable 'network_provider' from source: set_fact 7530 1727096047.09725: variable 'omit' from source: magic vars 7530 1727096047.09727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096047.09751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096047.09776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096047.09797: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096047.09812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096047.09852: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096047.09859: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096047.09866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096047.09973: Set connection var ansible_pipelining to False 7530 1727096047.09985: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096047.09994: Set connection var ansible_timeout to 10 7530 1727096047.10006: Set connection var ansible_shell_executable to /bin/sh 7530 1727096047.10011: Set connection var ansible_shell_type to sh 7530 1727096047.10020: Set connection var ansible_connection to ssh 7530 1727096047.10073: variable 'ansible_shell_executable' from source: unknown 7530 1727096047.10081: variable 'ansible_connection' from source: unknown 7530 1727096047.10087: variable 'ansible_module_compression' from source: unknown 7530 1727096047.10093: variable 'ansible_shell_type' from source: unknown 7530 1727096047.10099: variable 'ansible_shell_executable' from source: unknown 7530 1727096047.10258: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096047.10262: variable 'ansible_pipelining' from source: unknown 7530 1727096047.10264: variable 'ansible_timeout' from source: unknown 7530 1727096047.10265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096047.10345: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096047.10362: variable 'omit' from source: magic vars 7530 1727096047.10493: starting attempt loop 7530 1727096047.10496: running the handler 7530 1727096047.10499: variable 'ansible_facts' from source: unknown 7530 1727096047.11451: _low_level_execute_command(): starting 7530 1727096047.11474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096047.12143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096047.12160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096047.12183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096047.12232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096047.12249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096047.12263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096047.12342: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096047.12359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096047.12379: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096047.12570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096047.12627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096047.14365: stdout chunk (state=3): >>>/root <<< 7530 1727096047.14538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096047.14570: stdout chunk (state=3): >>><<< 7530 1727096047.14573: stderr chunk (state=3): >>><<< 7530 1727096047.14678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096047.14682: _low_level_execute_command(): starting 7530 1727096047.14685: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605 `" && echo ansible-tmp-1727096047.1463242-8922-110332604409605="` echo /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605 `" ) && sleep 0' 7530 1727096047.15790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096047.15794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096047.15807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096047.15812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096047.15814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096047.16057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096047.16060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096047.16062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096047.16163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096047.16244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096047.18272: stdout chunk (state=3): 
>>>ansible-tmp-1727096047.1463242-8922-110332604409605=/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605 <<< 7530 1727096047.18515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096047.18519: stdout chunk (state=3): >>><<< 7530 1727096047.18521: stderr chunk (state=3): >>><<< 7530 1727096047.18674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096047.1463242-8922-110332604409605=/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096047.18678: variable 'ansible_module_compression' from source: unknown 7530 1727096047.18681: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7530 1727096047.18730: variable 'ansible_facts' from 
source: unknown 7530 1727096047.18977: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py 7530 1727096047.19148: Sending initial data 7530 1727096047.19270: Sent initial data (154 bytes) 7530 1727096047.19947: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096047.20048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096047.20089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096047.20309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096047.21989: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7530 1727096047.22003: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7530 1727096047.22017: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7530 1727096047.22054: stderr chunk (state=3): >>>debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096047.22102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096047.22143: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp7eukqrvl /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py <<< 7530 1727096047.22147: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py" <<< 7530 1727096047.22199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp7eukqrvl" to remote "/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py" <<< 7530 1727096047.24106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096047.24111: stdout chunk (state=3): >>><<< 7530 1727096047.24113: stderr chunk (state=3): >>><<< 7530 1727096047.24122: done transferring module to remote 7530 1727096047.24138: _low_level_execute_command(): starting 7530 1727096047.24146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/ 
/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py && sleep 0' 7530 1727096047.24887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096047.24997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096047.25039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096047.25106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096047.27061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096047.27065: stdout chunk (state=3): >>><<< 7530 1727096047.27070: stderr chunk (state=3): >>><<< 7530 1727096047.27086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096047.27089: _low_level_execute_command(): starting 7530 1727096047.27095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/AnsiballZ_systemd.py && sleep 0' 7530 1727096047.27777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096047.27786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096047.27798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096047.28017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096047.28034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096047.28037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096047.58675: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9551872", "MemoryPeak": "10067968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3338498048", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "175656000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not 
set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7530 1727096047.58706: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target Net<<< 7530 1727096047.58715: stdout chunk (state=3): >>>workManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7530 1727096047.61277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096047.61282: stdout chunk (state=3): >>><<< 7530 1727096047.61284: stderr chunk (state=3): >>><<< 7530 1727096047.61289: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9551872", "MemoryPeak": "10067968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3338498048", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "175656000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": 
"Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096047.61359: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096047.61574: _low_level_execute_command(): starting 7530 1727096047.61577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096047.1463242-8922-110332604409605/ > /dev/null 2>&1 && sleep 0' 7530 1727096047.62690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096047.62829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096047.62905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096047.64823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096047.65103: stderr chunk (state=3): >>><<< 7530 1727096047.65107: stdout chunk (state=3): >>><<< 7530 1727096047.65110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096047.65112: handler run complete 7530 1727096047.65187: attempt loop complete, returning result 7530 1727096047.65197: _execute() done 7530 1727096047.65209: 
dumping result to json 7530 1727096047.65237: done dumping result, returning 7530 1727096047.65776: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-086b-f4f0-0000000000c4] 7530 1727096047.65780: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c4 7530 1727096047.66380: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c4 7530 1727096047.66384: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096047.66451: no more pending results, returning what we have 7530 1727096047.66455: results queue empty 7530 1727096047.66456: checking for any_errors_fatal 7530 1727096047.66464: done checking for any_errors_fatal 7530 1727096047.66466: checking for max_fail_percentage 7530 1727096047.66470: done checking for max_fail_percentage 7530 1727096047.66471: checking to see if all hosts have failed and the running result is not ok 7530 1727096047.66472: done checking to see if all hosts have failed 7530 1727096047.66473: getting the remaining hosts for this loop 7530 1727096047.66474: done getting the remaining hosts for this loop 7530 1727096047.66478: getting the next task for host managed_node3 7530 1727096047.66485: done getting next task for host managed_node3 7530 1727096047.66489: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096047.66491: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096047.66503: getting variables 7530 1727096047.66504: in VariableManager get_vars() 7530 1727096047.66560: Calling all_inventory to load vars for managed_node3 7530 1727096047.66563: Calling groups_inventory to load vars for managed_node3 7530 1727096047.66565: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096047.66691: Calling all_plugins_play to load vars for managed_node3 7530 1727096047.66695: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096047.66698: Calling groups_plugins_play to load vars for managed_node3 7530 1727096047.69839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096047.71843: done with get_vars() 7530 1727096047.71880: done getting variables 7530 1727096047.71952: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:54:07 -0400 (0:00:00.743) 0:00:38.508 ****** 7530 1727096047.71993: entering _queue_task() for managed_node3/service 7530 1727096047.72496: worker is 1 (out of 1 available) 7530 1727096047.72508: exiting _queue_task() for managed_node3/service 7530 1727096047.72530: done queuing things up, now waiting for results queue to drain 7530 1727096047.72534: waiting for pending results... 
7530 1727096047.72810: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096047.73019: in run() - task 0afff68d-5257-086b-f4f0-0000000000c5 7530 1727096047.73150: variable 'ansible_search_path' from source: unknown 7530 1727096047.73156: variable 'ansible_search_path' from source: unknown 7530 1727096047.73159: calling self._execute() 7530 1727096047.73254: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096047.73270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096047.73289: variable 'omit' from source: magic vars 7530 1727096047.73750: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.73772: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096047.74003: variable 'network_provider' from source: set_fact 7530 1727096047.74214: Evaluated conditional (network_provider == "nm"): True 7530 1727096047.74304: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096047.74403: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096047.74588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096047.77156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096047.77234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096047.77279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096047.77319: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096047.77350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 
1727096047.77438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.77573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.77578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.77580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.77583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.77614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.77642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.77672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.77719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 7530 1727096047.77739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.77781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.77809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.77836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.77876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.77916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.78053: variable 'network_connections' from source: task vars 7530 1727096047.78077: variable 'interface' from source: play vars 7530 1727096047.78162: variable 'interface' from source: play vars 7530 1727096047.78354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096047.78466: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096047.78511: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096047.78548: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096047.78591: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096047.78643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096047.78679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096047.78711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.78741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096047.78809: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096047.79097: variable 'network_connections' from source: task vars 7530 1727096047.79114: variable 'interface' from source: play vars 7530 1727096047.79191: variable 'interface' from source: play vars 7530 1727096047.79300: Evaluated conditional (__network_wpa_supplicant_required): False 7530 1727096047.79304: when evaluation is False, skipping this task 7530 1727096047.79306: _execute() done 7530 1727096047.79309: dumping result to json 7530 1727096047.79311: done dumping result, returning 7530 1727096047.79313: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-086b-f4f0-0000000000c5] 7530 1727096047.79325: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c5 skipping: [managed_node3] => { "changed": false, 
"false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7530 1727096047.79524: no more pending results, returning what we have 7530 1727096047.79528: results queue empty 7530 1727096047.79530: checking for any_errors_fatal 7530 1727096047.79554: done checking for any_errors_fatal 7530 1727096047.79556: checking for max_fail_percentage 7530 1727096047.79558: done checking for max_fail_percentage 7530 1727096047.79559: checking to see if all hosts have failed and the running result is not ok 7530 1727096047.79560: done checking to see if all hosts have failed 7530 1727096047.79561: getting the remaining hosts for this loop 7530 1727096047.79562: done getting the remaining hosts for this loop 7530 1727096047.79567: getting the next task for host managed_node3 7530 1727096047.79575: done getting next task for host managed_node3 7530 1727096047.79579: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7530 1727096047.79583: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096047.79604: getting variables 7530 1727096047.79606: in VariableManager get_vars() 7530 1727096047.79660: Calling all_inventory to load vars for managed_node3 7530 1727096047.79664: Calling groups_inventory to load vars for managed_node3 7530 1727096047.79785: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096047.79794: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c5 7530 1727096047.79798: WORKER PROCESS EXITING 7530 1727096047.79809: Calling all_plugins_play to load vars for managed_node3 7530 1727096047.79813: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096047.79817: Calling groups_plugins_play to load vars for managed_node3 7530 1727096047.81520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096047.83119: done with get_vars() 7530 1727096047.83153: done getting variables 7530 1727096047.83223: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:54:07 -0400 (0:00:00.112) 0:00:38.620 ****** 7530 1727096047.83259: entering _queue_task() for managed_node3/service 7530 1727096047.83621: worker is 1 (out of 1 available) 7530 1727096047.83634: exiting _queue_task() for managed_node3/service 7530 1727096047.83761: done queuing things up, now waiting for results queue to drain 7530 1727096047.83763: waiting for pending results... 
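The wpa_supplicant task above was skipped because its last `when:` clause, `__network_wpa_supplicant_required`, evaluated to False after the earlier clauses passed. A rough sketch of that first-failing-condition behavior — `evaluate_when` is a hypothetical stand-in, not TaskExecutor's real code path, but the returned dict matches the `skipping:` result printed in the log:

```python
# Hypothetical sketch: conditions are checked in order; the first one that
# evaluates False is reported back as "false_condition", and the task is
# skipped without running its module.
def evaluate_when(conditions):
    for expr, value in conditions.items():
        if not value:
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}

result = evaluate_when({
    "ansible_distribution_major_version != '6'": True,   # passed (log: True)
    'network_provider == "nm"': True,                    # passed (log: True)
    "__network_wpa_supplicant_required": False,          # failed -> skip
})
print(result["false_condition"])  # __network_wpa_supplicant_required
```

Note that the subsequent "Enable network service" and "Ensure initscripts network file dependency" tasks skip the same way, on `network_provider == "initscripts"` instead.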
7530 1727096047.83965: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7530 1727096047.84130: in run() - task 0afff68d-5257-086b-f4f0-0000000000c6 7530 1727096047.84192: variable 'ansible_search_path' from source: unknown 7530 1727096047.84197: variable 'ansible_search_path' from source: unknown 7530 1727096047.84203: calling self._execute() 7530 1727096047.84321: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096047.84335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096047.84350: variable 'omit' from source: magic vars 7530 1727096047.84750: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.84755: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096047.84878: variable 'network_provider' from source: set_fact 7530 1727096047.84954: Evaluated conditional (network_provider == "initscripts"): False 7530 1727096047.84957: when evaluation is False, skipping this task 7530 1727096047.84959: _execute() done 7530 1727096047.84962: dumping result to json 7530 1727096047.84965: done dumping result, returning 7530 1727096047.84969: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-086b-f4f0-0000000000c6] 7530 1727096047.84972: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c6 7530 1727096047.85048: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c6 7530 1727096047.85051: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096047.85102: no more pending results, returning what we have 7530 1727096047.85107: results queue empty 7530 1727096047.85108: checking for any_errors_fatal 7530 1727096047.85117: done checking for any_errors_fatal 7530 
1727096047.85117: checking for max_fail_percentage 7530 1727096047.85119: done checking for max_fail_percentage 7530 1727096047.85120: checking to see if all hosts have failed and the running result is not ok 7530 1727096047.85121: done checking to see if all hosts have failed 7530 1727096047.85122: getting the remaining hosts for this loop 7530 1727096047.85124: done getting the remaining hosts for this loop 7530 1727096047.85127: getting the next task for host managed_node3 7530 1727096047.85135: done getting next task for host managed_node3 7530 1727096047.85139: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7530 1727096047.85142: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096047.85170: getting variables 7530 1727096047.85172: in VariableManager get_vars() 7530 1727096047.85223: Calling all_inventory to load vars for managed_node3 7530 1727096047.85226: Calling groups_inventory to load vars for managed_node3 7530 1727096047.85228: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096047.85241: Calling all_plugins_play to load vars for managed_node3 7530 1727096047.85245: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096047.85247: Calling groups_plugins_play to load vars for managed_node3 7530 1727096047.86906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096047.88699: done with get_vars() 7530 1727096047.88731: done getting variables 7530 1727096047.88799: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:54:07 -0400 (0:00:00.055) 0:00:38.676 ****** 7530 1727096047.88844: entering _queue_task() for managed_node3/copy 7530 1727096047.89219: worker is 1 (out of 1 available) 7530 1727096047.89232: exiting _queue_task() for managed_node3/copy 7530 1727096047.89245: done queuing things up, now waiting for results queue to drain 7530 1727096047.89246: waiting for pending results... 
7530 1727096047.89614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7530 1727096047.89640: in run() - task 0afff68d-5257-086b-f4f0-0000000000c7 7530 1727096047.89656: variable 'ansible_search_path' from source: unknown 7530 1727096047.89660: variable 'ansible_search_path' from source: unknown 7530 1727096047.89710: calling self._execute() 7530 1727096047.89805: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096047.89817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096047.89828: variable 'omit' from source: magic vars 7530 1727096047.90249: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.90262: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096047.90392: variable 'network_provider' from source: set_fact 7530 1727096047.90398: Evaluated conditional (network_provider == "initscripts"): False 7530 1727096047.90401: when evaluation is False, skipping this task 7530 1727096047.90405: _execute() done 7530 1727096047.90407: dumping result to json 7530 1727096047.90410: done dumping result, returning 7530 1727096047.90420: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-086b-f4f0-0000000000c7] 7530 1727096047.90424: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c7 7530 1727096047.90773: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c7 7530 1727096047.90776: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7530 1727096047.90815: no more pending results, returning what we have 7530 1727096047.90819: results queue empty 7530 1727096047.90820: checking for any_errors_fatal 7530 
1727096047.90825: done checking for any_errors_fatal 7530 1727096047.90826: checking for max_fail_percentage 7530 1727096047.90828: done checking for max_fail_percentage 7530 1727096047.90829: checking to see if all hosts have failed and the running result is not ok 7530 1727096047.90830: done checking to see if all hosts have failed 7530 1727096047.90831: getting the remaining hosts for this loop 7530 1727096047.90832: done getting the remaining hosts for this loop 7530 1727096047.90836: getting the next task for host managed_node3 7530 1727096047.90842: done getting next task for host managed_node3 7530 1727096047.90847: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7530 1727096047.90850: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096047.90870: getting variables 7530 1727096047.90872: in VariableManager get_vars() 7530 1727096047.90918: Calling all_inventory to load vars for managed_node3 7530 1727096047.90921: Calling groups_inventory to load vars for managed_node3 7530 1727096047.90923: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096047.90932: Calling all_plugins_play to load vars for managed_node3 7530 1727096047.90935: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096047.90938: Calling groups_plugins_play to load vars for managed_node3 7530 1727096047.92251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096047.93844: done with get_vars() 7530 1727096047.93884: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:54:07 -0400 (0:00:00.051) 0:00:38.728 ****** 7530 1727096047.93982: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7530 1727096047.94395: worker is 1 (out of 1 available) 7530 1727096047.94408: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7530 1727096047.94422: done queuing things up, now waiting for results queue to drain 7530 1727096047.94423: waiting for pending results... 
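The "Configure networking connection profiles" task queued here ultimately executes the role's `network_connections` module with a connection-profile payload (visible in the `module_args` further down this log: a `veth0` ethernet profile with static IPv4/IPv6 addressing). A hedged Python sketch of that payload's shape, using the standard `ipaddress` module to sanity-check the static addresses — the field names and values are copied from the log's `module_args`, but the validation helper itself is my own illustration, not part of the role:

```python
import ipaddress

# Connection-profile payload as it appears in this log's module_args.
# (Values from the log; the checks below are illustrative only.)
payload = {
    "provider": "nm",
    "connections": [{
        "name": "veth0",
        "type": "ethernet",
        "state": "up",
        "ip": {
            "dhcp4": False,
            "auto6": False,
            "auto_gateway": False,
            "address": ["2001:db8::2/64", "203.0.113.2/24"],
            "gateway4": "203.0.113.1",
            "gateway6": "2001:db8::1",
        },
    }],
}

def static_addresses(conn):
    """Parse each CIDR string on a connection into an ip_interface object."""
    return [ipaddress.ip_interface(a) for a in conn["ip"]["address"]]

for conn in payload["connections"]:
    addrs = static_addresses(conn)
    # With dhcp4/auto6 disabled, every address must be an explicit CIDR.
    assert all(a.network.prefixlen > 0 for a in addrs)
    # Each gateway should fall inside one of the configured subnets.
    gw4 = ipaddress.ip_address(conn["ip"]["gateway4"])
    gw6 = ipaddress.ip_address(conn["ip"]["gateway6"])
    assert any(gw4 in a.network for a in addrs if a.version == 4)
    assert any(gw6 in a.network for a in addrs if a.version == 6)
```

Here `203.0.113.1` lies inside `203.0.113.0/24` and `2001:db8::1` inside `2001:db8::/64`, so both gateway checks pass, consistent with the module reporting `changed: true` with no warnings later in the log.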
7530 1727096047.94705: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7530 1727096047.94848: in run() - task 0afff68d-5257-086b-f4f0-0000000000c8 7530 1727096047.94871: variable 'ansible_search_path' from source: unknown 7530 1727096047.94889: variable 'ansible_search_path' from source: unknown 7530 1727096047.94931: calling self._execute() 7530 1727096047.95047: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096047.95060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096047.95078: variable 'omit' from source: magic vars 7530 1727096047.95465: variable 'ansible_distribution_major_version' from source: facts 7530 1727096047.95537: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096047.95541: variable 'omit' from source: magic vars 7530 1727096047.95558: variable 'omit' from source: magic vars 7530 1727096047.95729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096047.98011: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096047.98089: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096047.98130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096047.98273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096047.98277: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096047.98297: variable 'network_provider' from source: set_fact 7530 1727096047.98437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096047.98849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096047.98880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096047.98929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096047.98962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096047.99062: variable 'omit' from source: magic vars 7530 1727096047.99205: variable 'omit' from source: magic vars 7530 1727096047.99327: variable 'network_connections' from source: task vars 7530 1727096047.99379: variable 'interface' from source: play vars 7530 1727096047.99421: variable 'interface' from source: play vars 7530 1727096047.99610: variable 'omit' from source: magic vars 7530 1727096047.99624: variable '__lsr_ansible_managed' from source: task vars 7530 1727096047.99689: variable '__lsr_ansible_managed' from source: task vars 7530 1727096048.00034: Loaded config def from plugin (lookup/template) 7530 1727096048.00037: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7530 1727096048.00055: File lookup term: get_ansible_managed.j2 7530 1727096048.00063: variable 'ansible_search_path' from source: unknown 7530 1727096048.00076: evaluation_path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7530 1727096048.00173: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7530 1727096048.00177: variable 'ansible_search_path' from source: unknown 7530 1727096048.07735: variable 'ansible_managed' from source: unknown 7530 1727096048.08277: variable 'omit' from source: magic vars 7530 1727096048.08280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096048.08302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096048.08325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096048.08347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096048.08393: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096048.08816: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096048.08819: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.08821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096048.09275: Set connection var ansible_pipelining to False 7530 1727096048.09280: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096048.09283: Set connection var ansible_timeout to 10 7530 1727096048.09285: Set connection var ansible_shell_executable to /bin/sh 7530 1727096048.09287: Set connection var ansible_shell_type to sh 7530 1727096048.09289: Set connection var ansible_connection to ssh 7530 1727096048.09291: variable 'ansible_shell_executable' from source: unknown 7530 1727096048.09293: variable 'ansible_connection' from source: unknown 7530 1727096048.09295: variable 'ansible_module_compression' from source: unknown 7530 1727096048.09298: variable 'ansible_shell_type' from source: unknown 7530 1727096048.09300: variable 'ansible_shell_executable' from source: unknown 7530 1727096048.09302: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.09304: variable 'ansible_pipelining' from source: unknown 7530 1727096048.09305: variable 'ansible_timeout' from source: unknown 7530 1727096048.09307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096048.09752: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096048.09765: variable 'omit' from source: magic vars 7530 1727096048.10107: starting attempt loop 7530 1727096048.10110: running the handler 7530 
1727096048.10113: _low_level_execute_command(): starting 7530 1727096048.10114: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096048.11398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096048.11512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096048.11526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096048.11581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096048.11681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096048.13463: stdout chunk (state=3): >>>/root <<< 7530 1727096048.13511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096048.13554: stderr chunk (state=3): >>><<< 7530 1727096048.13773: stdout chunk (state=3): >>><<< 7530 1727096048.13777: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096048.13780: _low_level_execute_command(): starting 7530 1727096048.13783: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806 `" && echo ansible-tmp-1727096048.136952-8961-35915013915806="` echo /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806 `" ) && sleep 0' 7530 1727096048.15044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096048.15059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096048.15170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096048.15185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096048.15273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096048.15384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096048.15502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096048.17475: stdout chunk (state=3): >>>ansible-tmp-1727096048.136952-8961-35915013915806=/root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806 <<< 7530 1727096048.17732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096048.17736: stdout chunk (state=3): >>><<< 7530 1727096048.17739: stderr chunk (state=3): >>><<< 7530 1727096048.17741: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096048.136952-8961-35915013915806=/root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096048.17743: variable 'ansible_module_compression' from source: unknown 7530 1727096048.17759: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7530 1727096048.17794: variable 'ansible_facts' from source: unknown 7530 1727096048.17904: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py 7530 1727096048.18161: Sending initial data 7530 1727096048.18164: Sent initial data (164 bytes) 7530 1727096048.18788: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096048.18848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096048.18878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096048.18895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096048.19137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096048.20795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096048.20846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096048.20898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpjif7smyg /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py <<< 7530 1727096048.20910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py" <<< 7530 1727096048.20950: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 7530 1727096048.20972: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpjif7smyg" to remote "/root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py" <<< 7530 1727096048.20981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py" <<< 7530 1727096048.22651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096048.22655: stdout chunk (state=3): >>><<< 7530 1727096048.22658: stderr chunk (state=3): >>><<< 7530 1727096048.22660: done transferring module to remote 7530 1727096048.22662: _low_level_execute_command(): starting 7530 1727096048.22664: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/ /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py && sleep 0' 7530 1727096048.23775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096048.23894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096048.23947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096048.23986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096048.25887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096048.25904: stdout chunk (state=3): >>><<< 7530 1727096048.25922: stderr chunk (state=3): >>><<< 7530 1727096048.25946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096048.25960: _low_level_execute_command(): starting 7530 1727096048.25978: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/AnsiballZ_network_connections.py && sleep 0' 7530 1727096048.27171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096048.27218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096048.27222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096048.27275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096048.27365: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7530 1727096048.75844: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7530 1727096048.77699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096048.77921: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096048.77924: stdout chunk (state=3): >>><<< 7530 1727096048.77927: stderr chunk (state=3): >>><<< 7530 1727096048.77930: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096048.77935: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': False, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096048.77938: _low_level_execute_command(): starting 7530 1727096048.77940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096048.136952-8961-35915013915806/ > /dev/null 2>&1 && sleep 0' 7530 1727096048.79223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096048.79266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096048.79285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096048.79325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096048.79453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096048.81543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096048.81605: stderr chunk (state=3): >>><<< 7530 1727096048.81614: stdout chunk (state=3): >>><<< 7530 1727096048.81639: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096048.81651: handler run complete 7530 1727096048.81692: attempt loop complete, returning result 7530 1727096048.81699: _execute() done 7530 1727096048.81705: dumping result to json 7530 1727096048.81714: done dumping result, returning 7530 1727096048.81727: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-086b-f4f0-0000000000c8] 7530 1727096048.81742: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c8 7530 1727096048.82080: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c8 7530 1727096048.82084: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up 
persistent_state:present, 'veth0': add connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab (not-active) 7530 1727096048.82218: no more pending results, returning what we have 7530 1727096048.82221: results queue empty 7530 1727096048.82222: checking for any_errors_fatal 7530 1727096048.82230: done checking for any_errors_fatal 7530 1727096048.82230: checking for max_fail_percentage 7530 1727096048.82235: done checking for max_fail_percentage 7530 1727096048.82236: checking to see if all hosts have failed and the running result is not ok 7530 1727096048.82237: done checking to see if all hosts have failed 7530 1727096048.82238: getting the remaining hosts for this loop 7530 1727096048.82239: done getting the remaining hosts for this loop 7530 1727096048.82243: getting the next task for host managed_node3 7530 1727096048.82248: done getting next task for host managed_node3 7530 1727096048.82252: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096048.82254: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096048.82266: getting variables 7530 1727096048.82472: in VariableManager get_vars() 7530 1727096048.82522: Calling all_inventory to load vars for managed_node3 7530 1727096048.82525: Calling groups_inventory to load vars for managed_node3 7530 1727096048.82527: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096048.82539: Calling all_plugins_play to load vars for managed_node3 7530 1727096048.82542: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096048.82545: Calling groups_plugins_play to load vars for managed_node3 7530 1727096048.84210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096048.85663: done with get_vars() 7530 1727096048.85699: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:54:08 -0400 (0:00:00.918) 0:00:39.646 ****** 7530 1727096048.85800: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096048.86163: worker is 1 (out of 1 available) 7530 1727096048.86179: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096048.86192: done queuing things up, now waiting for results queue to drain 7530 1727096048.86193: waiting for pending results... 
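For context, the module_args recorded above for the "Configure networking connection profiles" task map back to role variables along the following lines (reconstructed from the logged invocation; the playbook itself is not part of this log):

```yaml
# Reconstructed from the logged module_args -- illustrative, not the original playbook.
network_provider: nm
network_connections:
  - name: veth0
    type: ethernet
    state: up
    ip:
      dhcp4: false
      auto6: false
      auto_gateway: false
      address:
        - 2001:db8::2/64
        - 203.0.113.2/24
      gateway4: 203.0.113.1
      gateway6: 2001:db8::1
```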
7530 1727096048.86507: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096048.86672: in run() - task 0afff68d-5257-086b-f4f0-0000000000c9 7530 1727096048.86695: variable 'ansible_search_path' from source: unknown 7530 1727096048.86703: variable 'ansible_search_path' from source: unknown 7530 1727096048.86754: calling self._execute() 7530 1727096048.86875: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.86888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096048.86901: variable 'omit' from source: magic vars 7530 1727096048.87312: variable 'ansible_distribution_major_version' from source: facts 7530 1727096048.87334: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096048.87482: variable 'network_state' from source: role '' defaults 7530 1727096048.87507: Evaluated conditional (network_state != {}): False 7530 1727096048.87602: when evaluation is False, skipping this task 7530 1727096048.87606: _execute() done 7530 1727096048.87608: dumping result to json 7530 1727096048.87611: done dumping result, returning 7530 1727096048.87613: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-086b-f4f0-0000000000c9] 7530 1727096048.87615: sending task result for task 0afff68d-5257-086b-f4f0-0000000000c9 7530 1727096048.87695: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000c9 7530 1727096048.87698: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096048.87769: no more pending results, returning what we have 7530 1727096048.87773: results queue empty 7530 1727096048.87774: checking for any_errors_fatal 7530 1727096048.87785: done checking for any_errors_fatal 7530 1727096048.87786: 
checking for max_fail_percentage 7530 1727096048.87788: done checking for max_fail_percentage 7530 1727096048.87789: checking to see if all hosts have failed and the running result is not ok 7530 1727096048.87790: done checking to see if all hosts have failed 7530 1727096048.87791: getting the remaining hosts for this loop 7530 1727096048.87793: done getting the remaining hosts for this loop 7530 1727096048.87797: getting the next task for host managed_node3 7530 1727096048.87805: done getting next task for host managed_node3 7530 1727096048.87810: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096048.87814: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096048.87844: getting variables 7530 1727096048.87846: in VariableManager get_vars() 7530 1727096048.88008: Calling all_inventory to load vars for managed_node3 7530 1727096048.88012: Calling groups_inventory to load vars for managed_node3 7530 1727096048.88015: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096048.88029: Calling all_plugins_play to load vars for managed_node3 7530 1727096048.88035: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096048.88040: Calling groups_plugins_play to load vars for managed_node3 7530 1727096048.89642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096048.91416: done with get_vars() 7530 1727096048.91462: done getting variables 7530 1727096048.91529: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:54:08 -0400 (0:00:00.057) 0:00:39.703 ****** 7530 1727096048.91575: entering _queue_task() for managed_node3/debug 7530 1727096048.91986: worker is 1 (out of 1 available) 7530 1727096048.92006: exiting _queue_task() for managed_node3/debug 7530 1727096048.92020: done queuing things up, now waiting for results queue to drain 7530 1727096048.92022: waiting for pending results... 
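Judging from the variable name printed in the result that follows, the task at roles/network/tasks/main.yml:177 is essentially a debug over the captured module stderr (a sketch of the task shape, not the role's verbatim source):

```yaml
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```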
7530 1727096048.92387: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096048.92450: in run() - task 0afff68d-5257-086b-f4f0-0000000000ca 7530 1727096048.92475: variable 'ansible_search_path' from source: unknown 7530 1727096048.92484: variable 'ansible_search_path' from source: unknown 7530 1727096048.92525: calling self._execute() 7530 1727096048.92642: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.92656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096048.92674: variable 'omit' from source: magic vars 7530 1727096048.93070: variable 'ansible_distribution_major_version' from source: facts 7530 1727096048.93092: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096048.93173: variable 'omit' from source: magic vars 7530 1727096048.93177: variable 'omit' from source: magic vars 7530 1727096048.93214: variable 'omit' from source: magic vars 7530 1727096048.93263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096048.93304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096048.93328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096048.93351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096048.93365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096048.93401: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096048.93409: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.93416: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7530 1727096048.93515: Set connection var ansible_pipelining to False 7530 1727096048.93525: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096048.93537: Set connection var ansible_timeout to 10 7530 1727096048.93551: Set connection var ansible_shell_executable to /bin/sh 7530 1727096048.93558: Set connection var ansible_shell_type to sh 7530 1727096048.93572: Set connection var ansible_connection to ssh 7530 1727096048.93772: variable 'ansible_shell_executable' from source: unknown 7530 1727096048.93775: variable 'ansible_connection' from source: unknown 7530 1727096048.93777: variable 'ansible_module_compression' from source: unknown 7530 1727096048.93779: variable 'ansible_shell_type' from source: unknown 7530 1727096048.93781: variable 'ansible_shell_executable' from source: unknown 7530 1727096048.93783: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.93784: variable 'ansible_pipelining' from source: unknown 7530 1727096048.93786: variable 'ansible_timeout' from source: unknown 7530 1727096048.93788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096048.93790: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096048.93793: variable 'omit' from source: magic vars 7530 1727096048.93794: starting attempt loop 7530 1727096048.93797: running the handler 7530 1727096048.93925: variable '__network_connections_result' from source: set_fact 7530 1727096048.93989: handler run complete 7530 1727096048.94012: attempt loop complete, returning result 7530 1727096048.94020: _execute() done 7530 1727096048.94026: dumping result to json 7530 1727096048.94037: done dumping result, returning 7530 
1727096048.94051: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-086b-f4f0-0000000000ca] 7530 1727096048.94060: sending task result for task 0afff68d-5257-086b-f4f0-0000000000ca 7530 1727096048.94173: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000ca 7530 1727096048.94181: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab (not-active)" ] } 7530 1727096048.94260: no more pending results, returning what we have 7530 1727096048.94264: results queue empty 7530 1727096048.94265: checking for any_errors_fatal 7530 1727096048.94273: done checking for any_errors_fatal 7530 1727096048.94273: checking for max_fail_percentage 7530 1727096048.94275: done checking for max_fail_percentage 7530 1727096048.94276: checking to see if all hosts have failed and the running result is not ok 7530 1727096048.94277: done checking to see if all hosts have failed 7530 1727096048.94278: getting the remaining hosts for this loop 7530 1727096048.94279: done getting the remaining hosts for this loop 7530 1727096048.94283: getting the next task for host managed_node3 7530 1727096048.94290: done getting next task for host managed_node3 7530 1727096048.94293: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096048.94296: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096048.94307: getting variables 7530 1727096048.94308: in VariableManager get_vars() 7530 1727096048.94357: Calling all_inventory to load vars for managed_node3 7530 1727096048.94359: Calling groups_inventory to load vars for managed_node3 7530 1727096048.94362: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096048.94535: Calling all_plugins_play to load vars for managed_node3 7530 1727096048.94539: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096048.94543: Calling groups_plugins_play to load vars for managed_node3 7530 1727096048.96037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096048.97598: done with get_vars() 7530 1727096048.97636: done getting variables 7530 1727096048.97701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:54:08 -0400 (0:00:00.061) 0:00:39.765 ****** 7530 1727096048.97738: entering _queue_task() for managed_node3/debug 7530 1727096048.98294: worker is 1 (out of 1 available) 7530 1727096048.98306: exiting _queue_task() for managed_node3/debug 7530 
1727096048.98320: done queuing things up, now waiting for results queue to drain 7530 1727096048.98321: waiting for pending results... 7530 1727096048.98863: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096048.99128: in run() - task 0afff68d-5257-086b-f4f0-0000000000cb 7530 1727096048.99174: variable 'ansible_search_path' from source: unknown 7530 1727096048.99178: variable 'ansible_search_path' from source: unknown 7530 1727096048.99210: calling self._execute() 7530 1727096048.99374: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096048.99379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096048.99381: variable 'omit' from source: magic vars 7530 1727096048.99750: variable 'ansible_distribution_major_version' from source: facts 7530 1727096048.99769: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096048.99781: variable 'omit' from source: magic vars 7530 1727096049.00073: variable 'omit' from source: magic vars 7530 1727096049.00077: variable 'omit' from source: magic vars 7530 1727096049.00079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096049.00081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096049.00083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096049.00085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096049.00088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096049.00090: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096049.00092: variable 
'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.00094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.00194: Set connection var ansible_pipelining to False 7530 1727096049.00211: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096049.00221: Set connection var ansible_timeout to 10 7530 1727096049.00236: Set connection var ansible_shell_executable to /bin/sh 7530 1727096049.00243: Set connection var ansible_shell_type to sh 7530 1727096049.00248: Set connection var ansible_connection to ssh 7530 1727096049.00279: variable 'ansible_shell_executable' from source: unknown 7530 1727096049.00287: variable 'ansible_connection' from source: unknown 7530 1727096049.00293: variable 'ansible_module_compression' from source: unknown 7530 1727096049.00299: variable 'ansible_shell_type' from source: unknown 7530 1727096049.00305: variable 'ansible_shell_executable' from source: unknown 7530 1727096049.00317: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.00327: variable 'ansible_pipelining' from source: unknown 7530 1727096049.00337: variable 'ansible_timeout' from source: unknown 7530 1727096049.00345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.00499: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096049.00517: variable 'omit' from source: magic vars 7530 1727096049.00526: starting attempt loop 7530 1727096049.00540: running the handler 7530 1727096049.00596: variable '__network_connections_result' from source: set_fact 7530 1727096049.00701: variable '__network_connections_result' from source: set_fact 7530 1727096049.00865: handler run 
complete 7530 1727096049.00905: attempt loop complete, returning result 7530 1727096049.00913: _execute() done 7530 1727096049.00920: dumping result to json 7530 1727096049.00929: done dumping result, returning 7530 1727096049.00949: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-086b-f4f0-0000000000cb] 7530 1727096049.00973: sending task result for task 0afff68d-5257-086b-f4f0-0000000000cb 7530 1727096049.01374: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000cb 7530 1727096049.01378: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab (not-active)" ] } } 7530 1727096049.01470: no more pending results, returning what we have 7530 1727096049.01474: results queue empty 7530 1727096049.01475: checking for any_errors_fatal 7530 1727096049.01479: done checking for any_errors_fatal 7530 1727096049.01480: checking for max_fail_percentage 7530 1727096049.01482: done 
checking for max_fail_percentage 7530 1727096049.01483: checking to see if all hosts have failed and the running result is not ok 7530 1727096049.01483: done checking to see if all hosts have failed 7530 1727096049.01484: getting the remaining hosts for this loop 7530 1727096049.01485: done getting the remaining hosts for this loop 7530 1727096049.01489: getting the next task for host managed_node3 7530 1727096049.01494: done getting next task for host managed_node3 7530 1727096049.01498: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096049.01500: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096049.01510: getting variables 7530 1727096049.01512: in VariableManager get_vars() 7530 1727096049.01572: Calling all_inventory to load vars for managed_node3 7530 1727096049.01575: Calling groups_inventory to load vars for managed_node3 7530 1727096049.01578: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.01587: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.01590: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.01593: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.03040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.04812: done with get_vars() 7530 1727096049.04846: done getting variables 7530 1727096049.04914: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:54:09 -0400 (0:00:00.072) 0:00:39.837 ****** 7530 1727096049.04952: entering _queue_task() for managed_node3/debug 7530 1727096049.05343: worker is 1 (out of 1 available) 7530 1727096049.05356: exiting _queue_task() for managed_node3/debug 7530 1727096049.05572: done queuing things up, now waiting for results queue to drain 7530 1727096049.05574: waiting for pending results... 
7530 1727096049.05690: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096049.05852: in run() - task 0afff68d-5257-086b-f4f0-0000000000cc 7530 1727096049.05876: variable 'ansible_search_path' from source: unknown 7530 1727096049.05917: variable 'ansible_search_path' from source: unknown 7530 1727096049.05939: calling self._execute() 7530 1727096049.06244: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.06249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.06252: variable 'omit' from source: magic vars 7530 1727096049.06608: variable 'ansible_distribution_major_version' from source: facts 7530 1727096049.06626: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096049.06760: variable 'network_state' from source: role '' defaults 7530 1727096049.06781: Evaluated conditional (network_state != {}): False 7530 1727096049.06789: when evaluation is False, skipping this task 7530 1727096049.06797: _execute() done 7530 1727096049.06807: dumping result to json 7530 1727096049.06813: done dumping result, returning 7530 1727096049.06824: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-086b-f4f0-0000000000cc] 7530 1727096049.06835: sending task result for task 0afff68d-5257-086b-f4f0-0000000000cc skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7530 1727096049.07023: no more pending results, returning what we have 7530 1727096049.07027: results queue empty 7530 1727096049.07028: checking for any_errors_fatal 7530 1727096049.07046: done checking for any_errors_fatal 7530 1727096049.07047: checking for max_fail_percentage 7530 1727096049.07049: done checking for max_fail_percentage 7530 1727096049.07050: checking to see if all hosts have failed and the running result 
is not ok 7530 1727096049.07052: done checking to see if all hosts have failed 7530 1727096049.07052: getting the remaining hosts for this loop 7530 1727096049.07054: done getting the remaining hosts for this loop 7530 1727096049.07058: getting the next task for host managed_node3 7530 1727096049.07064: done getting next task for host managed_node3 7530 1727096049.07071: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096049.07074: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096049.07097: getting variables 7530 1727096049.07099: in VariableManager get_vars() 7530 1727096049.07151: Calling all_inventory to load vars for managed_node3 7530 1727096049.07154: Calling groups_inventory to load vars for managed_node3 7530 1727096049.07156: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.07372: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.07377: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.07382: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.08083: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000cc 7530 1727096049.08088: WORKER PROCESS EXITING 7530 1727096049.08848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.11440: done with get_vars() 7530 1727096049.11479: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:54:09 -0400 (0:00:00.066) 0:00:39.904 ****** 7530 1727096049.11616: entering _queue_task() for managed_node3/ping 7530 1727096049.12204: worker is 1 (out of 1 available) 7530 1727096049.12216: exiting _queue_task() for managed_node3/ping 7530 1727096049.12227: done queuing things up, now waiting for results queue to drain 7530 1727096049.12229: waiting for pending results... 
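Both network_state tasks above ("Configure networking state" and "Show debug messages for the network_state") were skipped because the conditional network_state != {} evaluated False: the role's network_state default is an empty dict, and this run supplied only network_connections. A caller wanting the nmstate-based path would set something like the following (illustrative only; any non-empty mapping satisfies the conditional):

```yaml
# Illustrative: a non-empty network_state makes the skipped tasks run.
network_state:
  interfaces:
    - name: veth0
      type: ethernet
      state: up
```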
7530 1727096049.12312: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096049.12485: in run() - task 0afff68d-5257-086b-f4f0-0000000000cd 7530 1727096049.12508: variable 'ansible_search_path' from source: unknown 7530 1727096049.12517: variable 'ansible_search_path' from source: unknown 7530 1727096049.12600: calling self._execute() 7530 1727096049.12708: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.12745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.12779: variable 'omit' from source: magic vars 7530 1727096049.13202: variable 'ansible_distribution_major_version' from source: facts 7530 1727096049.13225: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096049.13241: variable 'omit' from source: magic vars 7530 1727096049.13309: variable 'omit' from source: magic vars 7530 1727096049.13357: variable 'omit' from source: magic vars 7530 1727096049.13408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096049.13458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096049.13486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096049.13508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096049.13524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096049.13566: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096049.13658: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.13661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096049.13700: Set connection var ansible_pipelining to False 7530 1727096049.13712: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096049.13722: Set connection var ansible_timeout to 10 7530 1727096049.13738: Set connection var ansible_shell_executable to /bin/sh 7530 1727096049.13745: Set connection var ansible_shell_type to sh 7530 1727096049.13751: Set connection var ansible_connection to ssh 7530 1727096049.13787: variable 'ansible_shell_executable' from source: unknown 7530 1727096049.13794: variable 'ansible_connection' from source: unknown 7530 1727096049.13801: variable 'ansible_module_compression' from source: unknown 7530 1727096049.13808: variable 'ansible_shell_type' from source: unknown 7530 1727096049.13815: variable 'ansible_shell_executable' from source: unknown 7530 1727096049.13821: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.13829: variable 'ansible_pipelining' from source: unknown 7530 1727096049.13838: variable 'ansible_timeout' from source: unknown 7530 1727096049.13846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.14074: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096049.14099: variable 'omit' from source: magic vars 7530 1727096049.14173: starting attempt loop 7530 1727096049.14176: running the handler 7530 1727096049.14179: _low_level_execute_command(): starting 7530 1727096049.14180: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096049.14924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096049.14980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.14996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096049.15083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.15104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.15127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096049.15147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.15228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.16965: stdout chunk (state=3): >>>/root <<< 7530 1727096049.17138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.17142: stdout chunk (state=3): >>><<< 7530 1727096049.17144: stderr chunk (state=3): >>><<< 7530 1727096049.17274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096049.17278: _low_level_execute_command(): starting 7530 1727096049.17280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955 `" && echo ansible-tmp-1727096049.1717434-9015-184430898927955="` echo /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955 `" ) && sleep 0' 7530 1727096049.17876: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096049.17895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096049.17910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.17927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096049.18051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096049.18072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096049.18125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.18163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.20203: stdout chunk (state=3): >>>ansible-tmp-1727096049.1717434-9015-184430898927955=/root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955 <<< 7530 1727096049.20356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.20360: stdout chunk (state=3): >>><<< 7530 1727096049.20362: stderr chunk (state=3): >>><<< 7530 1727096049.20370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096049.1717434-9015-184430898927955=/root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096049.20417: variable 'ansible_module_compression' from source: unknown 7530 1727096049.20454: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7530 1727096049.20488: variable 'ansible_facts' from source: unknown 7530 1727096049.20541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py 7530 1727096049.20653: Sending initial data 7530 1727096049.20656: Sent initial data (151 bytes) 7530 1727096049.21301: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found <<< 7530 1727096049.21305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.21401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.21432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.23154: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096049.23187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096049.23219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5spxy9kn /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py <<< 7530 1727096049.23226: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py" <<< 7530 1727096049.23251: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp5spxy9kn" to remote "/root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py" <<< 7530 1727096049.23770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.23820: stderr chunk (state=3): >>><<< 7530 1727096049.23825: stdout chunk (state=3): >>><<< 7530 1727096049.23869: done transferring module to remote 7530 1727096049.23883: _low_level_execute_command(): starting 7530 1727096049.23886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/ /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py && sleep 0' 7530 1727096049.24610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096049.24615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096049.24617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.24619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096049.24630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096049.24636: stderr chunk 
(state=3): >>>debug2: match not found <<< 7530 1727096049.24638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.24640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096049.24721: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096049.24726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.24819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.24836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.26779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.26784: stdout chunk (state=3): >>><<< 7530 1727096049.26787: stderr chunk (state=3): >>><<< 7530 1727096049.26804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096049.26807: _low_level_execute_command(): starting 7530 1727096049.26816: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/AnsiballZ_ping.py && sleep 0' 7530 1727096049.27478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.27495: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 7530 1727096049.27511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.27587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.43612: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7530 1727096049.45112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096049.45151: stderr chunk (state=3): >>><<< 7530 1727096049.45154: stdout chunk (state=3): >>><<< 7530 1727096049.45167: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096049.45190: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096049.45198: _low_level_execute_command(): starting 7530 1727096049.45203: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096049.1717434-9015-184430898927955/ > /dev/null 2>&1 && sleep 0' 7530 1727096049.45672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.45676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.45679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096049.45681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found <<< 7530 1727096049.45683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.45738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.45741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096049.45744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.45787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.47685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.47774: stderr chunk (state=3): >>><<< 7530 1727096049.47778: stdout chunk (state=3): >>><<< 7530 1727096049.47781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 7530 1727096049.47790: handler run complete 7530 1727096049.47792: attempt loop complete, returning result 7530 1727096049.47794: _execute() done 7530 1727096049.47795: dumping result to json 7530 1727096049.47798: done dumping result, returning 7530 1727096049.47799: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-086b-f4f0-0000000000cd] 7530 1727096049.47801: sending task result for task 0afff68d-5257-086b-f4f0-0000000000cd 7530 1727096049.47864: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000cd 7530 1727096049.47869: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7530 1727096049.47933: no more pending results, returning what we have 7530 1727096049.47937: results queue empty 7530 1727096049.47938: checking for any_errors_fatal 7530 1727096049.47945: done checking for any_errors_fatal 7530 1727096049.47946: checking for max_fail_percentage 7530 1727096049.47947: done checking for max_fail_percentage 7530 1727096049.47948: checking to see if all hosts have failed and the running result is not ok 7530 1727096049.47949: done checking to see if all hosts have failed 7530 1727096049.47950: getting the remaining hosts for this loop 7530 1727096049.47952: done getting the remaining hosts for this loop 7530 1727096049.47955: getting the next task for host managed_node3 7530 1727096049.47964: done getting next task for host managed_node3 7530 1727096049.47966: ^ task is: TASK: meta (role_complete) 7530 1727096049.47970: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096049.47981: getting variables 7530 1727096049.47983: in VariableManager get_vars() 7530 1727096049.48030: Calling all_inventory to load vars for managed_node3 7530 1727096049.48032: Calling groups_inventory to load vars for managed_node3 7530 1727096049.48035: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.48045: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.48048: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.48050: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.48997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.50376: done with get_vars() 7530 1727096049.50411: done getting variables 7530 1727096049.50503: done queuing things up, now waiting for results queue to drain 7530 1727096049.50505: results queue empty 7530 1727096049.50506: checking for any_errors_fatal 7530 1727096049.50508: done checking for any_errors_fatal 7530 1727096049.50509: checking for max_fail_percentage 7530 1727096049.50510: done checking for max_fail_percentage 7530 1727096049.50511: checking to see if all hosts have failed and the running result is not ok 7530 1727096049.50512: done checking to see if all hosts have failed 7530 1727096049.50512: getting the remaining hosts for this loop 7530 1727096049.50513: done getting the remaining hosts for this loop 7530 1727096049.50516: getting the next task for host managed_node3 7530 1727096049.50520: done getting next task for host managed_node3 7530 1727096049.50522: ^ task is: TASK: Include the task 'assert_device_present.yml' 7530 1727096049.50524: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096049.50526: getting variables 7530 1727096049.50527: in VariableManager get_vars() 7530 1727096049.50548: Calling all_inventory to load vars for managed_node3 7530 1727096049.50550: Calling groups_inventory to load vars for managed_node3 7530 1727096049.50552: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.50557: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.50560: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.50562: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.51264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.52113: done with get_vars() 7530 1727096049.52135: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:108 Monday 23 September 2024 08:54:09 -0400 (0:00:00.405) 0:00:40.310 ****** 7530 1727096049.52193: entering _queue_task() for managed_node3/include_tasks 7530 1727096049.52549: worker is 1 (out of 1 available) 7530 1727096049.52565: exiting _queue_task() for managed_node3/include_tasks 7530 1727096049.52578: done queuing things up, now waiting for results queue to drain 7530 1727096049.52580: waiting for pending results... 
7530 1727096049.52891: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7530 1727096049.52952: in run() - task 0afff68d-5257-086b-f4f0-0000000000fd 7530 1727096049.52984: variable 'ansible_search_path' from source: unknown 7530 1727096049.53027: calling self._execute() 7530 1727096049.53146: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.53156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.53178: variable 'omit' from source: magic vars 7530 1727096049.53610: variable 'ansible_distribution_major_version' from source: facts 7530 1727096049.53613: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096049.53616: _execute() done 7530 1727096049.53674: dumping result to json 7530 1727096049.53678: done dumping result, returning 7530 1727096049.53681: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0afff68d-5257-086b-f4f0-0000000000fd] 7530 1727096049.53684: sending task result for task 0afff68d-5257-086b-f4f0-0000000000fd 7530 1727096049.53762: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000fd 7530 1727096049.53766: WORKER PROCESS EXITING 7530 1727096049.53802: no more pending results, returning what we have 7530 1727096049.53808: in VariableManager get_vars() 7530 1727096049.53874: Calling all_inventory to load vars for managed_node3 7530 1727096049.53880: Calling groups_inventory to load vars for managed_node3 7530 1727096049.53883: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.53897: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.53900: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.53903: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.55194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096049.56049: done with get_vars() 7530 1727096049.56073: variable 'ansible_search_path' from source: unknown 7530 1727096049.56086: we have included files to process 7530 1727096049.56087: generating all_blocks data 7530 1727096049.56089: done generating all_blocks data 7530 1727096049.56093: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096049.56094: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096049.56095: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7530 1727096049.56177: in VariableManager get_vars() 7530 1727096049.56196: done with get_vars() 7530 1727096049.56281: done processing included file 7530 1727096049.56282: iterating over new_blocks loaded from include file 7530 1727096049.56284: in VariableManager get_vars() 7530 1727096049.56310: done with get_vars() 7530 1727096049.56312: filtering new block on tags 7530 1727096049.56333: done filtering new block on tags 7530 1727096049.56335: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7530 1727096049.56341: extending task lists for all hosts with included blocks 7530 1727096049.59897: done extending task lists 7530 1727096049.59899: done processing included files 7530 1727096049.59900: results queue empty 7530 1727096049.59900: checking for any_errors_fatal 7530 1727096049.59901: done checking for any_errors_fatal 7530 1727096049.59902: checking for max_fail_percentage 7530 1727096049.59903: done checking for max_fail_percentage 7530 1727096049.59903: checking to see if all hosts have failed and the running 
result is not ok 7530 1727096049.59904: done checking to see if all hosts have failed 7530 1727096049.59905: getting the remaining hosts for this loop 7530 1727096049.59906: done getting the remaining hosts for this loop 7530 1727096049.59907: getting the next task for host managed_node3 7530 1727096049.59910: done getting next task for host managed_node3 7530 1727096049.59912: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7530 1727096049.59914: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096049.59916: getting variables 7530 1727096049.59916: in VariableManager get_vars() 7530 1727096049.59936: Calling all_inventory to load vars for managed_node3 7530 1727096049.59938: Calling groups_inventory to load vars for managed_node3 7530 1727096049.59939: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.59945: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.59946: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.59948: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.65265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.66125: done with get_vars() 7530 1727096049.66152: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:54:09 -0400 (0:00:00.140) 0:00:40.450 ****** 7530 1727096049.66213: entering _queue_task() for managed_node3/include_tasks 7530 1727096049.66486: worker is 1 (out of 1 available) 7530 1727096049.66500: exiting _queue_task() for managed_node3/include_tasks 7530 1727096049.66512: done queuing things up, now waiting for results queue to drain 7530 1727096049.66514: waiting for pending results... 
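For orientation, the include chain being traced here (the play includes `assert_device_present.yml`, whose line 3 in turn includes `get_interface_stat.yml`) can be sketched as follows. This is a hypothetical reconstruction from the task names and paths in this log, not the actual source of the fedora.linux_system_roles test files:

```yaml
# assert_device_present.yml -- sketch inferred from the log above.
# The log shows a task at this file's line 3 named
# "Include the task 'get_interface_stat.yml'".
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```

When the include runs, the loader processes the included file, generates its blocks, filters them on tags, and extends each host's task list — exactly the "processing included file / filtering new block on tags / extending task lists" sequence visible in the trace.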
7530 1727096049.66719: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7530 1727096049.66807: in run() - task 0afff68d-5257-086b-f4f0-00000000143a 7530 1727096049.66820: variable 'ansible_search_path' from source: unknown 7530 1727096049.66824: variable 'ansible_search_path' from source: unknown 7530 1727096049.66856: calling self._execute() 7530 1727096049.66943: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.66949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.66959: variable 'omit' from source: magic vars 7530 1727096049.67270: variable 'ansible_distribution_major_version' from source: facts 7530 1727096049.67282: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096049.67288: _execute() done 7530 1727096049.67291: dumping result to json 7530 1727096049.67293: done dumping result, returning 7530 1727096049.67302: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-086b-f4f0-00000000143a] 7530 1727096049.67306: sending task result for task 0afff68d-5257-086b-f4f0-00000000143a 7530 1727096049.67410: done sending task result for task 0afff68d-5257-086b-f4f0-00000000143a 7530 1727096049.67413: WORKER PROCESS EXITING 7530 1727096049.67441: no more pending results, returning what we have 7530 1727096049.67445: in VariableManager get_vars() 7530 1727096049.67512: Calling all_inventory to load vars for managed_node3 7530 1727096049.67515: Calling groups_inventory to load vars for managed_node3 7530 1727096049.67517: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.67530: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.67533: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.67536: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.68358: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.69242: done with get_vars() 7530 1727096049.69265: variable 'ansible_search_path' from source: unknown 7530 1727096049.69266: variable 'ansible_search_path' from source: unknown 7530 1727096049.69298: we have included files to process 7530 1727096049.69299: generating all_blocks data 7530 1727096049.69300: done generating all_blocks data 7530 1727096049.69301: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096049.69302: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096049.69303: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7530 1727096049.69443: done processing included file 7530 1727096049.69445: iterating over new_blocks loaded from include file 7530 1727096049.69447: in VariableManager get_vars() 7530 1727096049.69465: done with get_vars() 7530 1727096049.69466: filtering new block on tags 7530 1727096049.69479: done filtering new block on tags 7530 1727096049.69480: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7530 1727096049.69484: extending task lists for all hosts with included blocks 7530 1727096049.69553: done extending task lists 7530 1727096049.69554: done processing included files 7530 1727096049.69554: results queue empty 7530 1727096049.69555: checking for any_errors_fatal 7530 1727096049.69557: done checking for any_errors_fatal 7530 1727096049.69558: checking for max_fail_percentage 7530 1727096049.69558: done checking for max_fail_percentage 7530 1727096049.69559: 
checking to see if all hosts have failed and the running result is not ok 7530 1727096049.69560: done checking to see if all hosts have failed 7530 1727096049.69560: getting the remaining hosts for this loop 7530 1727096049.69561: done getting the remaining hosts for this loop 7530 1727096049.69563: getting the next task for host managed_node3 7530 1727096049.69565: done getting next task for host managed_node3 7530 1727096049.69566: ^ task is: TASK: Get stat for interface {{ interface }} 7530 1727096049.69570: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096049.69572: getting variables 7530 1727096049.69573: in VariableManager get_vars() 7530 1727096049.69585: Calling all_inventory to load vars for managed_node3 7530 1727096049.69586: Calling groups_inventory to load vars for managed_node3 7530 1727096049.69588: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096049.69592: Calling all_plugins_play to load vars for managed_node3 7530 1727096049.69593: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096049.69595: Calling groups_plugins_play to load vars for managed_node3 7530 1727096049.70319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096049.71194: done with get_vars() 7530 1727096049.71219: done getting variables 7530 1727096049.71346: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:54:09 -0400 (0:00:00.051) 0:00:40.502 ****** 7530 1727096049.71372: entering _queue_task() for managed_node3/stat 7530 1727096049.71634: worker is 1 (out of 1 available) 7530 1727096049.71647: exiting _queue_task() for managed_node3/stat 7530 1727096049.71660: done queuing things up, now waiting for results queue to drain 7530 1727096049.71661: waiting for pending results... 
7530 1727096049.71857: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7530 1727096049.71951: in run() - task 0afff68d-5257-086b-f4f0-0000000016ba 7530 1727096049.71962: variable 'ansible_search_path' from source: unknown 7530 1727096049.71966: variable 'ansible_search_path' from source: unknown 7530 1727096049.72004: calling self._execute() 7530 1727096049.72091: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.72097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.72112: variable 'omit' from source: magic vars 7530 1727096049.72407: variable 'ansible_distribution_major_version' from source: facts 7530 1727096049.72418: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096049.72426: variable 'omit' from source: magic vars 7530 1727096049.72469: variable 'omit' from source: magic vars 7530 1727096049.72549: variable 'interface' from source: play vars 7530 1727096049.72562: variable 'omit' from source: magic vars 7530 1727096049.72601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096049.72629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096049.72654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096049.72664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096049.72675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096049.72699: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096049.72702: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.72705: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7530 1727096049.72781: Set connection var ansible_pipelining to False 7530 1727096049.72787: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096049.72792: Set connection var ansible_timeout to 10 7530 1727096049.72800: Set connection var ansible_shell_executable to /bin/sh 7530 1727096049.72803: Set connection var ansible_shell_type to sh 7530 1727096049.72805: Set connection var ansible_connection to ssh 7530 1727096049.72827: variable 'ansible_shell_executable' from source: unknown 7530 1727096049.72830: variable 'ansible_connection' from source: unknown 7530 1727096049.72835: variable 'ansible_module_compression' from source: unknown 7530 1727096049.72838: variable 'ansible_shell_type' from source: unknown 7530 1727096049.72840: variable 'ansible_shell_executable' from source: unknown 7530 1727096049.72843: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096049.72845: variable 'ansible_pipelining' from source: unknown 7530 1727096049.72847: variable 'ansible_timeout' from source: unknown 7530 1727096049.72849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096049.73005: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096049.73015: variable 'omit' from source: magic vars 7530 1727096049.73020: starting attempt loop 7530 1727096049.73023: running the handler 7530 1727096049.73037: _low_level_execute_command(): starting 7530 1727096049.73044: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096049.73566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 
1727096049.73581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.73595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.73654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.73657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096049.73660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.73707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.75486: stdout chunk (state=3): >>>/root <<< 7530 1727096049.75581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.75615: stderr chunk (state=3): >>><<< 7530 1727096049.75619: stdout chunk (state=3): >>><<< 7530 1727096049.75644: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096049.75657: _low_level_execute_command(): starting 7530 1727096049.75663: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215 `" && echo ansible-tmp-1727096049.7564483-9040-238136815214215="` echo /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215 `" ) && sleep 0' 7530 1727096049.76152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.76156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7530 1727096049.76176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.76218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.76222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096049.76224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.76270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.78384: stdout chunk (state=3): >>>ansible-tmp-1727096049.7564483-9040-238136815214215=/root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215 <<< 7530 1727096049.78558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.78563: stdout chunk (state=3): >>><<< 7530 1727096049.78565: stderr chunk (state=3): >>><<< 7530 1727096049.78775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096049.7564483-9040-238136815214215=/root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096049.78779: variable 'ansible_module_compression' from source: unknown 7530 1727096049.78782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7530 1727096049.78784: variable 'ansible_facts' from source: unknown 7530 1727096049.78885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py 7530 1727096049.79153: Sending initial data 7530 1727096049.79157: Sent initial data (151 bytes) 7530 1727096049.79970: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096049.79993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096049.80133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096049.80158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.80242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.81945: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096049.81973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096049.82004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmphcuqv091 /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py <<< 7530 1727096049.82014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py" <<< 7530 1727096049.82035: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmphcuqv091" to remote "/root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py" <<< 7530 1727096049.82039: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py" <<< 7530 1727096049.82553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.82600: stderr chunk (state=3): >>><<< 7530 1727096049.82603: stdout chunk (state=3): >>><<< 7530 1727096049.82650: done transferring module to remote 7530 1727096049.82659: _low_level_execute_command(): starting 7530 1727096049.82664: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/ /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py && sleep 0' 7530 1727096049.83125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.83129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.83134: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.83137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.83181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.83204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.83243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096049.85144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096049.85172: stderr chunk (state=3): >>><<< 7530 1727096049.85175: stdout chunk (state=3): >>><<< 7530 1727096049.85191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096049.85194: _low_level_execute_command(): starting 7530 1727096049.85207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/AnsiballZ_stat.py && sleep 0' 7530 1727096049.85681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096049.85685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.85688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096049.85690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096049.85736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096049.85739: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 7530 1727096049.85742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096049.85799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.01944: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25583, "dev": 23, "nlink": 1, "atime": 1727096041.3412454, "mtime": 1727096041.3412454, "ctime": 1727096041.3412454, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7530 1727096050.03428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
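The module invocation just executed corresponds to a `stat` task of roughly the following shape. This is a sketch inferred from the logged `module_args` (`path: /sys/class/net/veth0` with `get_attributes`, `get_checksum`, and `get_mime` all false) and the templated task name "Get stat for interface {{ interface }}"; the `register` variable name is a guess, not taken from the log:

```yaml
# Hypothetical reconstruction of the task at get_interface_stat.yml:3,
# based on the module_args dumped in the log above.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"  # interface == veth0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat  # hypothetical name; used by the later assert
```

Checking `/sys/class/net/<name>` is a cheap device-presence test: the kernel exposes every network interface there as a symlink (hence `"islnk": true` with `lnk_source` pointing into `/sys/devices/virtual/net/veth0` in the result), so the assertion step presumably only needs `stat.exists` to be true.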
<<< 7530 1727096050.03460: stderr chunk (state=3): >>><<< 7530 1727096050.03463: stdout chunk (state=3): >>><<< 7530 1727096050.03481: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25583, "dev": 23, "nlink": 1, "atime": 1727096041.3412454, "mtime": 1727096041.3412454, "ctime": 1727096041.3412454, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096050.03519: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096050.03527: _low_level_execute_command(): starting 7530 1727096050.03532: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096049.7564483-9040-238136815214215/ > /dev/null 2>&1 && sleep 0' 7530 1727096050.04000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.04004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.04007: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096050.04009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.04062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.04065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.04073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.04108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.06032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.06061: stderr chunk (state=3): >>><<< 7530 1727096050.06064: stdout chunk (state=3): >>><<< 7530 1727096050.06082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.06088: handler run complete 7530 1727096050.06119: attempt loop complete, returning result 7530 1727096050.06122: _execute() done 7530 1727096050.06124: dumping result to json 7530 1727096050.06129: done dumping result, returning 7530 1727096050.06138: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0afff68d-5257-086b-f4f0-0000000016ba] 7530 1727096050.06142: sending task result for task 0afff68d-5257-086b-f4f0-0000000016ba 7530 1727096050.06253: done sending task result for task 0afff68d-5257-086b-f4f0-0000000016ba 7530 1727096050.06256: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096041.3412454, "block_size": 4096, "blocks": 0, "ctime": 1727096041.3412454, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25583, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727096041.3412454, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7530 1727096050.06357: no more pending results, returning what we have 7530 1727096050.06361: results queue 
empty 7530 1727096050.06362: checking for any_errors_fatal 7530 1727096050.06364: done checking for any_errors_fatal 7530 1727096050.06364: checking for max_fail_percentage 7530 1727096050.06366: done checking for max_fail_percentage 7530 1727096050.06369: checking to see if all hosts have failed and the running result is not ok 7530 1727096050.06370: done checking to see if all hosts have failed 7530 1727096050.06370: getting the remaining hosts for this loop 7530 1727096050.06372: done getting the remaining hosts for this loop 7530 1727096050.06375: getting the next task for host managed_node3 7530 1727096050.06385: done getting next task for host managed_node3 7530 1727096050.06388: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7530 1727096050.06390: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096050.06394: getting variables 7530 1727096050.06395: in VariableManager get_vars() 7530 1727096050.06442: Calling all_inventory to load vars for managed_node3 7530 1727096050.06445: Calling groups_inventory to load vars for managed_node3 7530 1727096050.06447: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.06457: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.06460: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.06462: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.07284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.08191: done with get_vars() 7530 1727096050.08219: done getting variables 7530 1727096050.08270: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096050.08364: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:54:10 -0400 (0:00:00.370) 0:00:40.872 ****** 7530 1727096050.08389: entering _queue_task() for managed_node3/assert 7530 1727096050.08656: worker is 1 (out of 1 available) 7530 1727096050.08671: exiting _queue_task() for managed_node3/assert 7530 1727096050.08682: done queuing things up, now waiting for results queue to drain 7530 1727096050.08684: waiting for pending results... 
7530 1727096050.08869: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7530 1727096050.08946: in run() - task 0afff68d-5257-086b-f4f0-00000000143b 7530 1727096050.08958: variable 'ansible_search_path' from source: unknown 7530 1727096050.08962: variable 'ansible_search_path' from source: unknown 7530 1727096050.08994: calling self._execute() 7530 1727096050.09077: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.09083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.09093: variable 'omit' from source: magic vars 7530 1727096050.09384: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.09394: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.09400: variable 'omit' from source: magic vars 7530 1727096050.09426: variable 'omit' from source: magic vars 7530 1727096050.09502: variable 'interface' from source: play vars 7530 1727096050.09516: variable 'omit' from source: magic vars 7530 1727096050.09551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096050.09583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096050.09599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096050.09612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.09623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.09647: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096050.09650: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.09653: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.09727: Set connection var ansible_pipelining to False 7530 1727096050.09730: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096050.09737: Set connection var ansible_timeout to 10 7530 1727096050.09745: Set connection var ansible_shell_executable to /bin/sh 7530 1727096050.09748: Set connection var ansible_shell_type to sh 7530 1727096050.09750: Set connection var ansible_connection to ssh 7530 1727096050.09770: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.09773: variable 'ansible_connection' from source: unknown 7530 1727096050.09781: variable 'ansible_module_compression' from source: unknown 7530 1727096050.09786: variable 'ansible_shell_type' from source: unknown 7530 1727096050.09788: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.09791: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.09793: variable 'ansible_pipelining' from source: unknown 7530 1727096050.09795: variable 'ansible_timeout' from source: unknown 7530 1727096050.09797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.09899: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096050.09913: variable 'omit' from source: magic vars 7530 1727096050.09916: starting attempt loop 7530 1727096050.09919: running the handler 7530 1727096050.10023: variable 'interface_stat' from source: set_fact 7530 1727096050.10111: Evaluated conditional (interface_stat.stat.exists): True 7530 1727096050.10114: handler run complete 7530 1727096050.10116: attempt loop complete, returning result 7530 1727096050.10118: _execute() done 
7530 1727096050.10119: dumping result to json 7530 1727096050.10121: done dumping result, returning 7530 1727096050.10124: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0afff68d-5257-086b-f4f0-00000000143b] 7530 1727096050.10126: sending task result for task 0afff68d-5257-086b-f4f0-00000000143b ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096050.10246: no more pending results, returning what we have 7530 1727096050.10249: results queue empty 7530 1727096050.10250: checking for any_errors_fatal 7530 1727096050.10261: done checking for any_errors_fatal 7530 1727096050.10262: checking for max_fail_percentage 7530 1727096050.10263: done checking for max_fail_percentage 7530 1727096050.10264: checking to see if all hosts have failed and the running result is not ok 7530 1727096050.10265: done checking to see if all hosts have failed 7530 1727096050.10266: getting the remaining hosts for this loop 7530 1727096050.10269: done getting the remaining hosts for this loop 7530 1727096050.10272: getting the next task for host managed_node3 7530 1727096050.10280: done getting next task for host managed_node3 7530 1727096050.10283: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7530 1727096050.10285: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096050.10289: getting variables 7530 1727096050.10290: in VariableManager get_vars() 7530 1727096050.10338: Calling all_inventory to load vars for managed_node3 7530 1727096050.10340: Calling groups_inventory to load vars for managed_node3 7530 1727096050.10342: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.10352: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.10354: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.10356: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.10884: done sending task result for task 0afff68d-5257-086b-f4f0-00000000143b 7530 1727096050.10888: WORKER PROCESS EXITING 7530 1727096050.11293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.12183: done with get_vars() 7530 1727096050.12209: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:110 Monday 23 September 2024 08:54:10 -0400 (0:00:00.038) 0:00:40.911 ****** 7530 1727096050.12284: entering _queue_task() for managed_node3/include_tasks 7530 1727096050.12554: worker is 1 (out of 1 available) 7530 1727096050.12569: exiting _queue_task() for managed_node3/include_tasks 7530 1727096050.12582: done queuing things up, now waiting for results queue to drain 7530 1727096050.12584: waiting for pending results... 
7530 1727096050.12771: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7530 1727096050.12837: in run() - task 0afff68d-5257-086b-f4f0-0000000000fe 7530 1727096050.12857: variable 'ansible_search_path' from source: unknown 7530 1727096050.12889: calling self._execute() 7530 1727096050.12978: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.12984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.12994: variable 'omit' from source: magic vars 7530 1727096050.13281: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.13292: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.13297: _execute() done 7530 1727096050.13301: dumping result to json 7530 1727096050.13304: done dumping result, returning 7530 1727096050.13311: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-086b-f4f0-0000000000fe] 7530 1727096050.13315: sending task result for task 0afff68d-5257-086b-f4f0-0000000000fe 7530 1727096050.13407: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000fe 7530 1727096050.13411: WORKER PROCESS EXITING 7530 1727096050.13443: no more pending results, returning what we have 7530 1727096050.13447: in VariableManager get_vars() 7530 1727096050.13509: Calling all_inventory to load vars for managed_node3 7530 1727096050.13512: Calling groups_inventory to load vars for managed_node3 7530 1727096050.13514: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.13537: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.13540: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.13544: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.14378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7530 1727096050.15380: done with get_vars() 7530 1727096050.15396: variable 'ansible_search_path' from source: unknown 7530 1727096050.15410: we have included files to process 7530 1727096050.15410: generating all_blocks data 7530 1727096050.15412: done generating all_blocks data 7530 1727096050.15415: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7530 1727096050.15416: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7530 1727096050.15417: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7530 1727096050.15510: in VariableManager get_vars() 7530 1727096050.15530: done with get_vars() 7530 1727096050.15716: done processing included file 7530 1727096050.15718: iterating over new_blocks loaded from include file 7530 1727096050.15719: in VariableManager get_vars() 7530 1727096050.15738: done with get_vars() 7530 1727096050.15740: filtering new block on tags 7530 1727096050.15754: done filtering new block on tags 7530 1727096050.15755: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7530 1727096050.15760: extending task lists for all hosts with included blocks 7530 1727096050.18936: done extending task lists 7530 1727096050.18939: done processing included files 7530 1727096050.18939: results queue empty 7530 1727096050.18940: checking for any_errors_fatal 7530 1727096050.18943: done checking for any_errors_fatal 7530 1727096050.18944: checking for max_fail_percentage 7530 1727096050.18945: done checking for max_fail_percentage 7530 1727096050.18946: checking to see if all hosts have failed and the 
running result is not ok 7530 1727096050.18947: done checking to see if all hosts have failed 7530 1727096050.18948: getting the remaining hosts for this loop 7530 1727096050.18949: done getting the remaining hosts for this loop 7530 1727096050.18951: getting the next task for host managed_node3 7530 1727096050.18954: done getting next task for host managed_node3 7530 1727096050.18955: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7530 1727096050.18957: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096050.18959: getting variables 7530 1727096050.18960: in VariableManager get_vars() 7530 1727096050.18979: Calling all_inventory to load vars for managed_node3 7530 1727096050.18981: Calling groups_inventory to load vars for managed_node3 7530 1727096050.18982: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.18988: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.18989: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.18991: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.19671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.20682: done with get_vars() 7530 1727096050.20702: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:54:10 -0400 (0:00:00.084) 0:00:40.995 ****** 7530 1727096050.20763: entering _queue_task() for managed_node3/include_tasks 7530 1727096050.21036: worker is 1 (out of 1 available) 7530 1727096050.21048: exiting _queue_task() for managed_node3/include_tasks 7530 1727096050.21061: done queuing things up, now waiting for results queue to drain 7530 1727096050.21062: waiting for pending results... 
7530 1727096050.21247: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7530 1727096050.21317: in run() - task 0afff68d-5257-086b-f4f0-0000000016d2 7530 1727096050.21329: variable 'ansible_search_path' from source: unknown 7530 1727096050.21336: variable 'ansible_search_path' from source: unknown 7530 1727096050.21362: calling self._execute() 7530 1727096050.21448: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.21452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.21462: variable 'omit' from source: magic vars 7530 1727096050.21750: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.21761: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.21769: _execute() done 7530 1727096050.21774: dumping result to json 7530 1727096050.21776: done dumping result, returning 7530 1727096050.21783: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-086b-f4f0-0000000016d2] 7530 1727096050.21788: sending task result for task 0afff68d-5257-086b-f4f0-0000000016d2 7530 1727096050.21880: done sending task result for task 0afff68d-5257-086b-f4f0-0000000016d2 7530 1727096050.21883: WORKER PROCESS EXITING 7530 1727096050.21913: no more pending results, returning what we have 7530 1727096050.21918: in VariableManager get_vars() 7530 1727096050.21987: Calling all_inventory to load vars for managed_node3 7530 1727096050.21990: Calling groups_inventory to load vars for managed_node3 7530 1727096050.21994: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.22008: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.22011: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.22013: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.22850: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.23736: done with get_vars() 7530 1727096050.23758: variable 'ansible_search_path' from source: unknown 7530 1727096050.23760: variable 'ansible_search_path' from source: unknown 7530 1727096050.23791: we have included files to process 7530 1727096050.23792: generating all_blocks data 7530 1727096050.23793: done generating all_blocks data 7530 1727096050.23794: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7530 1727096050.23795: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7530 1727096050.23796: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7530 1727096050.24419: done processing included file 7530 1727096050.24421: iterating over new_blocks loaded from include file 7530 1727096050.24422: in VariableManager get_vars() 7530 1727096050.24444: done with get_vars() 7530 1727096050.24446: filtering new block on tags 7530 1727096050.24463: done filtering new block on tags 7530 1727096050.24465: in VariableManager get_vars() 7530 1727096050.24483: done with get_vars() 7530 1727096050.24484: filtering new block on tags 7530 1727096050.24497: done filtering new block on tags 7530 1727096050.24499: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7530 1727096050.24503: extending task lists for all hosts with included blocks 7530 1727096050.24603: done extending task lists 7530 1727096050.24604: done processing included files 7530 1727096050.24605: results queue empty 7530 1727096050.24605: checking for any_errors_fatal 7530 
1727096050.24608: done checking for any_errors_fatal 7530 1727096050.24609: checking for max_fail_percentage 7530 1727096050.24609: done checking for max_fail_percentage 7530 1727096050.24610: checking to see if all hosts have failed and the running result is not ok 7530 1727096050.24611: done checking to see if all hosts have failed 7530 1727096050.24611: getting the remaining hosts for this loop 7530 1727096050.24612: done getting the remaining hosts for this loop 7530 1727096050.24613: getting the next task for host managed_node3 7530 1727096050.24616: done getting next task for host managed_node3 7530 1727096050.24617: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7530 1727096050.24619: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096050.24621: getting variables 7530 1727096050.24622: in VariableManager get_vars() 7530 1727096050.24635: Calling all_inventory to load vars for managed_node3 7530 1727096050.24637: Calling groups_inventory to load vars for managed_node3 7530 1727096050.24638: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.24643: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.24644: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.24646: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.25382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.26251: done with get_vars() 7530 1727096050.26277: done getting variables 7530 1727096050.26314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:54:10 -0400 (0:00:00.055) 0:00:41.051 ****** 7530 1727096050.26341: entering _queue_task() for managed_node3/set_fact 7530 1727096050.26609: worker is 1 (out of 1 available) 7530 1727096050.26625: exiting _queue_task() for managed_node3/set_fact 7530 1727096050.26640: done queuing things up, now waiting for results queue to drain 7530 1727096050.26642: waiting for pending results... 
7530 1727096050.26828: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7530 1727096050.26912: in run() - task 0afff68d-5257-086b-f4f0-00000000195f 7530 1727096050.26923: variable 'ansible_search_path' from source: unknown 7530 1727096050.26928: variable 'ansible_search_path' from source: unknown 7530 1727096050.26958: calling self._execute() 7530 1727096050.27044: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.27048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.27058: variable 'omit' from source: magic vars 7530 1727096050.27352: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.27362: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.27371: variable 'omit' from source: magic vars 7530 1727096050.27409: variable 'omit' from source: magic vars 7530 1727096050.27474: variable 'omit' from source: magic vars 7530 1727096050.27478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096050.27504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096050.27523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096050.27539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.27549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.27574: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096050.27577: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.27580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096050.27652: Set connection var ansible_pipelining to False 7530 1727096050.27657: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096050.27662: Set connection var ansible_timeout to 10 7530 1727096050.27671: Set connection var ansible_shell_executable to /bin/sh 7530 1727096050.27675: Set connection var ansible_shell_type to sh 7530 1727096050.27677: Set connection var ansible_connection to ssh 7530 1727096050.27696: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.27699: variable 'ansible_connection' from source: unknown 7530 1727096050.27702: variable 'ansible_module_compression' from source: unknown 7530 1727096050.27704: variable 'ansible_shell_type' from source: unknown 7530 1727096050.27706: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.27708: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.27710: variable 'ansible_pipelining' from source: unknown 7530 1727096050.27715: variable 'ansible_timeout' from source: unknown 7530 1727096050.27718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.27824: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096050.27836: variable 'omit' from source: magic vars 7530 1727096050.27840: starting attempt loop 7530 1727096050.27842: running the handler 7530 1727096050.27855: handler run complete 7530 1727096050.27866: attempt loop complete, returning result 7530 1727096050.27870: _execute() done 7530 1727096050.27873: dumping result to json 7530 1727096050.27875: done dumping result, returning 7530 1727096050.27880: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0afff68d-5257-086b-f4f0-00000000195f] 7530 1727096050.27885: sending task result for task 0afff68d-5257-086b-f4f0-00000000195f 7530 1727096050.27971: done sending task result for task 0afff68d-5257-086b-f4f0-00000000195f 7530 1727096050.27974: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7530 1727096050.28030: no more pending results, returning what we have 7530 1727096050.28036: results queue empty 7530 1727096050.28037: checking for any_errors_fatal 7530 1727096050.28039: done checking for any_errors_fatal 7530 1727096050.28039: checking for max_fail_percentage 7530 1727096050.28041: done checking for max_fail_percentage 7530 1727096050.28042: checking to see if all hosts have failed and the running result is not ok 7530 1727096050.28043: done checking to see if all hosts have failed 7530 1727096050.28043: getting the remaining hosts for this loop 7530 1727096050.28045: done getting the remaining hosts for this loop 7530 1727096050.28048: getting the next task for host managed_node3 7530 1727096050.28056: done getting next task for host managed_node3 7530 1727096050.28058: ^ task is: TASK: Stat profile file 7530 1727096050.28062: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096050.28067: getting variables 7530 1727096050.28071: in VariableManager get_vars() 7530 1727096050.28120: Calling all_inventory to load vars for managed_node3 7530 1727096050.28123: Calling groups_inventory to load vars for managed_node3 7530 1727096050.28125: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.28139: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.28141: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.28144: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.28963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.29846: done with get_vars() 7530 1727096050.29873: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:54:10 -0400 (0:00:00.036) 0:00:41.087 ****** 7530 1727096050.29949: entering _queue_task() for managed_node3/stat 7530 1727096050.30221: worker is 1 (out of 1 available) 7530 1727096050.30236: exiting _queue_task() for managed_node3/stat 7530 1727096050.30250: done queuing things up, now waiting for results queue to drain 7530 1727096050.30252: waiting for pending results... 
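Unlike set_fact, the stat task queued above needs the remote host, and the `_low_level_execute_command()` calls that follow show the connection plugin's standard preamble: probe the remote home directory with `/bin/sh -c 'echo ~ && sleep 0'`, then create a uniquely named temp directory for the module payload. A sketch of those two steps run locally (assumptions: a POSIX `/bin/sh` is available, and the name format `ansible-tmp-<epoch>-<pid>-<random>` is inferred from the directory name in this log, not from ansible source):

```python
import os
import random
import subprocess
import time

# Step 1: the exact home-directory probe command seen in the log.
probe = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True,
)
home = probe.stdout.strip()

# Step 2: build a temp-dir name in the same style as the one in the log,
# ansible-tmp-1727096050.3409567-9059-62288709259435 (epoch-pid-random).
# Here it is only constructed, not created over ssh.
tmp_name = "ansible-tmp-{}-{}-{}".format(
    time.time(), os.getpid(), random.randint(0, 2**48)
)
```

The trailing `&& sleep 0` in the probe mirrors the log's commands; it forces the shell to report the command's exit status cleanly over the multiplexed ssh channel.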
7530 1727096050.30435: running TaskExecutor() for managed_node3/TASK: Stat profile file 7530 1727096050.30511: in run() - task 0afff68d-5257-086b-f4f0-000000001960 7530 1727096050.30521: variable 'ansible_search_path' from source: unknown 7530 1727096050.30525: variable 'ansible_search_path' from source: unknown 7530 1727096050.30556: calling self._execute() 7530 1727096050.30640: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.30643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.30653: variable 'omit' from source: magic vars 7530 1727096050.30936: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.30944: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.30950: variable 'omit' from source: magic vars 7530 1727096050.30985: variable 'omit' from source: magic vars 7530 1727096050.31059: variable 'profile' from source: include params 7530 1727096050.31063: variable 'interface' from source: play vars 7530 1727096050.31109: variable 'interface' from source: play vars 7530 1727096050.31124: variable 'omit' from source: magic vars 7530 1727096050.31166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096050.31195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096050.31211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096050.31224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.31244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.31262: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096050.31266: variable 
'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.31270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.31341: Set connection var ansible_pipelining to False 7530 1727096050.31346: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096050.31349: Set connection var ansible_timeout to 10 7530 1727096050.31378: Set connection var ansible_shell_executable to /bin/sh 7530 1727096050.31382: Set connection var ansible_shell_type to sh 7530 1727096050.31385: Set connection var ansible_connection to ssh 7530 1727096050.31388: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.31390: variable 'ansible_connection' from source: unknown 7530 1727096050.31392: variable 'ansible_module_compression' from source: unknown 7530 1727096050.31394: variable 'ansible_shell_type' from source: unknown 7530 1727096050.31396: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.31398: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.31400: variable 'ansible_pipelining' from source: unknown 7530 1727096050.31403: variable 'ansible_timeout' from source: unknown 7530 1727096050.31405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.31559: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096050.31565: variable 'omit' from source: magic vars 7530 1727096050.31573: starting attempt loop 7530 1727096050.31581: running the handler 7530 1727096050.31595: _low_level_execute_command(): starting 7530 1727096050.31604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096050.32136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.32141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.32145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.32192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.32195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.32202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.32245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.33936: stdout chunk (state=3): >>>/root <<< 7530 1727096050.34028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.34070: stderr chunk (state=3): >>><<< 7530 1727096050.34074: stdout chunk (state=3): >>><<< 7530 1727096050.34095: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.34107: _low_level_execute_command(): starting 7530 1727096050.34114: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435 `" && echo ansible-tmp-1727096050.3409567-9059-62288709259435="` echo /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435 `" ) && sleep 0' 7530 1727096050.34585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.34589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.34601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.34606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096050.34609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.34654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.34657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.34663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.34698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.36708: stdout chunk (state=3): >>>ansible-tmp-1727096050.3409567-9059-62288709259435=/root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435 <<< 7530 1727096050.36809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.36842: stderr chunk (state=3): >>><<< 7530 1727096050.36845: stdout chunk (state=3): >>><<< 7530 1727096050.36863: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096050.3409567-9059-62288709259435=/root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.36909: variable 'ansible_module_compression' from source: unknown 7530 1727096050.36956: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7530 1727096050.36991: variable 'ansible_facts' from source: unknown 7530 1727096050.37054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py 7530 1727096050.37164: Sending initial data 7530 1727096050.37169: Sent initial data (150 bytes) 7530 1727096050.37653: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.37658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.37661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.37712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.37715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.37718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.37762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.39487: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096050.39501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096050.39540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpwd0tv3r8 /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py <<< 7530 1727096050.39545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py" <<< 7530 1727096050.39572: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 7530 1727096050.39574: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpwd0tv3r8" to remote "/root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py" <<< 7530 1727096050.40079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.40121: stderr chunk (state=3): >>><<< 7530 1727096050.40126: stdout chunk (state=3): >>><<< 7530 1727096050.40155: done transferring module to remote 7530 1727096050.40165: _low_level_execute_command(): starting 7530 1727096050.40170: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/ /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py && sleep 0' 7530 1727096050.40631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.40638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.40640: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.40642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096050.40651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.40696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.40699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.40706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.40742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.42650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.42672: stderr chunk (state=3): >>><<< 7530 1727096050.42676: stdout chunk (state=3): >>><<< 7530 1727096050.42692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.42695: _low_level_execute_command(): starting 7530 1727096050.42701: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/AnsiballZ_stat.py && sleep 0' 7530 1727096050.43154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.43158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096050.43189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.43192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.43195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096050.43197: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.43255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.43258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.43264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.43308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.59564: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7530 1727096050.61093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096050.61119: stderr chunk (state=3): >>><<< 7530 1727096050.61122: stdout chunk (state=3): >>><<< 7530 1727096050.61139: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096050.61163: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096050.61174: _low_level_execute_command(): starting 7530 1727096050.61178: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096050.3409567-9059-62288709259435/ > /dev/null 2>&1 && sleep 0' 7530 1727096050.61654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.61658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.61661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096050.61663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.61712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.61716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.61718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.61763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.63671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.63698: stderr chunk (state=3): >>><<< 7530 1727096050.63701: stdout chunk (state=3): >>><<< 7530 1727096050.63718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.63724: handler run complete 7530 1727096050.63741: attempt loop complete, returning result 7530 1727096050.63744: _execute() done 7530 1727096050.63747: dumping result to json 7530 1727096050.63749: done dumping result, returning 7530 1727096050.63756: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-086b-f4f0-000000001960] 7530 1727096050.63761: sending task result for task 0afff68d-5257-086b-f4f0-000000001960 7530 1727096050.63862: done sending task result for task 0afff68d-5257-086b-f4f0-000000001960 7530 1727096050.63865: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7530 1727096050.63926: no more pending results, returning what we have 7530 1727096050.63929: results queue empty 7530 1727096050.63930: checking for any_errors_fatal 7530 1727096050.63944: done checking for any_errors_fatal 7530 1727096050.63944: checking for max_fail_percentage 7530 1727096050.63946: done checking for max_fail_percentage 7530 1727096050.63947: checking to see if all hosts have failed and the running result is not ok 7530 1727096050.63948: done checking to see if all hosts have failed 7530 1727096050.63948: getting the remaining hosts for this loop 7530 1727096050.63950: done getting the remaining hosts for 
this loop 7530 1727096050.63954: getting the next task for host managed_node3 7530 1727096050.63961: done getting next task for host managed_node3 7530 1727096050.63964: ^ task is: TASK: Set NM profile exist flag based on the profile files 7530 1727096050.63969: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096050.63973: getting variables 7530 1727096050.63975: in VariableManager get_vars() 7530 1727096050.64029: Calling all_inventory to load vars for managed_node3 7530 1727096050.64032: Calling groups_inventory to load vars for managed_node3 7530 1727096050.64036: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.64048: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.64050: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.64053: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.65003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.65886: done with get_vars() 7530 1727096050.65911: done getting variables 7530 1727096050.65963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:54:10 -0400 (0:00:00.360) 0:00:41.448 ****** 7530 1727096050.65990: entering _queue_task() for managed_node3/set_fact 7530 1727096050.66259: worker is 1 (out of 1 available) 7530 1727096050.66275: exiting _queue_task() for managed_node3/set_fact 7530 1727096050.66288: done queuing things up, now waiting for results queue to drain 7530 1727096050.66290: waiting for pending results... 
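The set_fact task queued here is gated on the earlier "Stat profile file" result: it only fires when `profile_stat.stat.exists` is true, and since the stat above returned `"exists": false`, the task is skipped below. The stat-then-flag pattern can be sketched in plain shell (the file name and flag name are illustrative stand-ins, exercised in a scratch directory rather than the real `/etc` path):

```shell
# Sketch of the stat -> flag logic behind the conditional skip.
# The path and flag name are hypothetical; the real task stats a
# NetworkManager profile file and records the result in a fact.
workdir="$(mktemp -d)"
profile_file="$workdir/veth0.nmconnection"

# First check: file absent, so the flag stays false (task would skip).
if [ -e "$profile_file" ]; then nm_profile_found=true; else nm_profile_found=false; fi
echo "before create: $nm_profile_found"

# Create the profile file and re-check: flag flips to true.
touch "$profile_file"
if [ -e "$profile_file" ]; then nm_profile_found=true; else nm_profile_found=false; fi
echo "after create: $nm_profile_found"

rm -r "$workdir"
```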
7530 1727096050.66474: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7530 1727096050.66560: in run() - task 0afff68d-5257-086b-f4f0-000000001961 7530 1727096050.66571: variable 'ansible_search_path' from source: unknown 7530 1727096050.66576: variable 'ansible_search_path' from source: unknown 7530 1727096050.66604: calling self._execute() 7530 1727096050.66687: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.66691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.66701: variable 'omit' from source: magic vars 7530 1727096050.66990: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.67003: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.67091: variable 'profile_stat' from source: set_fact 7530 1727096050.67102: Evaluated conditional (profile_stat.stat.exists): False 7530 1727096050.67106: when evaluation is False, skipping this task 7530 1727096050.67109: _execute() done 7530 1727096050.67112: dumping result to json 7530 1727096050.67114: done dumping result, returning 7530 1727096050.67120: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-086b-f4f0-000000001961] 7530 1727096050.67125: sending task result for task 0afff68d-5257-086b-f4f0-000000001961 7530 1727096050.67220: done sending task result for task 0afff68d-5257-086b-f4f0-000000001961 7530 1727096050.67223: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7530 1727096050.67274: no more pending results, returning what we have 7530 1727096050.67279: results queue empty 7530 1727096050.67280: checking for any_errors_fatal 7530 1727096050.67290: done checking for any_errors_fatal 7530 1727096050.67291: checking for 
max_fail_percentage 7530 1727096050.67293: done checking for max_fail_percentage 7530 1727096050.67294: checking to see if all hosts have failed and the running result is not ok 7530 1727096050.67295: done checking to see if all hosts have failed 7530 1727096050.67295: getting the remaining hosts for this loop 7530 1727096050.67297: done getting the remaining hosts for this loop 7530 1727096050.67300: getting the next task for host managed_node3 7530 1727096050.67308: done getting next task for host managed_node3 7530 1727096050.67312: ^ task is: TASK: Get NM profile info 7530 1727096050.67315: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096050.67319: getting variables 7530 1727096050.67321: in VariableManager get_vars() 7530 1727096050.67373: Calling all_inventory to load vars for managed_node3 7530 1727096050.67375: Calling groups_inventory to load vars for managed_node3 7530 1727096050.67378: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096050.67389: Calling all_plugins_play to load vars for managed_node3 7530 1727096050.67391: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096050.67394: Calling groups_plugins_play to load vars for managed_node3 7530 1727096050.68216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096050.69238: done with get_vars() 7530 1727096050.69258: done getting variables 7530 1727096050.69307: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:54:10 -0400 (0:00:00.033) 0:00:41.481 ****** 7530 1727096050.69336: entering _queue_task() for managed_node3/shell 7530 1727096050.69611: worker is 1 (out of 1 available) 7530 1727096050.69627: exiting _queue_task() for managed_node3/shell 7530 1727096050.69643: done queuing things up, now waiting for results queue to drain 7530 1727096050.69645: waiting for pending results... 
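The task being queued here ends up running `nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc` on the remote host (visible in the module result further down): it lists connection names with their backing files, then keeps only rows for the interface whose profile is persisted under `/etc` rather than a runtime one under `/run`. The same two-stage filter can be sketched against a hard-coded sample of nmcli's tabular output, so it runs without NetworkManager installed:

```shell
# Hypothetical sample of `nmcli -f NAME,FILENAME connection show` output;
# the second row mimics a runtime-only profile that the /etc grep drops.
sample='NAME FILENAME
veth0 /etc/NetworkManager/system-connections/veth0.nmconnection
lo /run/NetworkManager/system-connections/lo.nmconnection'

# Same pipeline shape as the task: match the interface name, then keep
# only profiles stored under /etc.
match="$(printf '%s\n' "$sample" | grep veth0 | grep /etc)"
echo "$match"
```

A non-empty match is what lets the follow-up task set its "NM profile exists" flag; an interface with only a `/run` profile (or none) would produce no output and a non-zero grep exit status.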
7530 1727096050.69830: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7530 1727096050.69919: in run() - task 0afff68d-5257-086b-f4f0-000000001962 7530 1727096050.69934: variable 'ansible_search_path' from source: unknown 7530 1727096050.69939: variable 'ansible_search_path' from source: unknown 7530 1727096050.69966: calling self._execute() 7530 1727096050.70049: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.70054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.70062: variable 'omit' from source: magic vars 7530 1727096050.70359: variable 'ansible_distribution_major_version' from source: facts 7530 1727096050.70371: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096050.70377: variable 'omit' from source: magic vars 7530 1727096050.70409: variable 'omit' from source: magic vars 7530 1727096050.70488: variable 'profile' from source: include params 7530 1727096050.70492: variable 'interface' from source: play vars 7530 1727096050.70540: variable 'interface' from source: play vars 7530 1727096050.70556: variable 'omit' from source: magic vars 7530 1727096050.70594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096050.70622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096050.70643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096050.70656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.70665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096050.70691: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096050.70694: variable 
'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.70699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.70772: Set connection var ansible_pipelining to False 7530 1727096050.70778: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096050.70783: Set connection var ansible_timeout to 10 7530 1727096050.70791: Set connection var ansible_shell_executable to /bin/sh 7530 1727096050.70794: Set connection var ansible_shell_type to sh 7530 1727096050.70796: Set connection var ansible_connection to ssh 7530 1727096050.70816: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.70819: variable 'ansible_connection' from source: unknown 7530 1727096050.70822: variable 'ansible_module_compression' from source: unknown 7530 1727096050.70824: variable 'ansible_shell_type' from source: unknown 7530 1727096050.70826: variable 'ansible_shell_executable' from source: unknown 7530 1727096050.70828: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096050.70835: variable 'ansible_pipelining' from source: unknown 7530 1727096050.70838: variable 'ansible_timeout' from source: unknown 7530 1727096050.70840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096050.70946: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096050.70956: variable 'omit' from source: magic vars 7530 1727096050.70962: starting attempt loop 7530 1727096050.70964: running the handler 7530 1727096050.70978: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096050.70995: _low_level_execute_command(): starting 7530 1727096050.71002: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096050.71546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.71551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.71605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.71609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.71611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.71659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.73417: stdout chunk (state=3): >>>/root <<< 7530 1727096050.73508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.73542: stderr chunk (state=3): >>><<< 
7530 1727096050.73545: stdout chunk (state=3): >>><<< 7530 1727096050.73571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.73588: _low_level_execute_command(): starting 7530 1727096050.73592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301 `" && echo ansible-tmp-1727096050.7357082-9068-51310022589301="` echo /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301 `" ) && sleep 0' 7530 1727096050.74062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.74076: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096050.74080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096050.74082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.74126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.74129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.74131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.74176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.76223: stdout chunk (state=3): >>>ansible-tmp-1727096050.7357082-9068-51310022589301=/root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301 <<< 7530 1727096050.76328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.76360: stderr chunk (state=3): >>><<< 7530 1727096050.76363: stdout chunk (state=3): >>><<< 7530 1727096050.76385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096050.7357082-9068-51310022589301=/root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.76416: variable 'ansible_module_compression' from source: unknown 7530 1727096050.76459: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096050.76491: variable 'ansible_facts' from source: unknown 7530 1727096050.76548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py 7530 1727096050.76664: Sending initial data 7530 1727096050.76669: Sent initial data (153 bytes) 7530 1727096050.77124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.77128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.77131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.77135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.77193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.77196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.77202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.77242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.78909: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 
1727096050.78938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096050.78974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmppzm788w6 /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py <<< 7530 1727096050.78978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py" <<< 7530 1727096050.79007: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmppzm788w6" to remote "/root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py" <<< 7530 1727096050.79010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py" <<< 7530 1727096050.79519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.79569: stderr chunk (state=3): >>><<< 7530 1727096050.79573: stdout chunk (state=3): >>><<< 7530 1727096050.79612: done transferring module to remote 7530 1727096050.79622: _low_level_execute_command(): starting 7530 1727096050.79627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/ /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py && sleep 0' 7530 1727096050.80109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.80113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096050.80123: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096050.80125: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096050.80171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.80175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.80177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.80221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096050.82287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096050.82292: stdout chunk (state=3): >>><<< 7530 1727096050.82295: stderr chunk (state=3): >>><<< 7530 1727096050.82298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096050.82300: _low_level_execute_command(): starting 7530 1727096050.82303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/AnsiballZ_command.py && sleep 0' 7530 1727096050.82954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096050.82975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096050.82993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096050.83013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096050.83122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096050.83137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096050.83155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096050.83243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096051.02127: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-23 08:54:10.999957", "end": "2024-09-23 08:54:11.018682", "delta": "0:00:00.018725", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096051.04023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096051.04051: stderr chunk (state=3): >>><<< 7530 1727096051.04055: stdout chunk (state=3): >>><<< 7530 1727096051.04079: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-23 08:54:10.999957", "end": "2024-09-23 08:54:11.018682", "delta": "0:00:00.018725", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.152 closed. 7530 1727096051.04107: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096051.04115: _low_level_execute_command(): starting 7530 1727096051.04123: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096050.7357082-9068-51310022589301/ > /dev/null 2>&1 && sleep 0' 7530 1727096051.04675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096051.04702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096051.04738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096051.06778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096051.06798: stderr chunk (state=3): >>><<< 7530 1727096051.06810: stdout chunk (state=3): >>><<< 7530 1727096051.06835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096051.06873: handler run complete 7530 1727096051.06886: Evaluated conditional (False): 
False 7530 1727096051.06972: attempt loop complete, returning result 7530 1727096051.06975: _execute() done 7530 1727096051.06978: dumping result to json 7530 1727096051.06980: done dumping result, returning 7530 1727096051.06982: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-086b-f4f0-000000001962] 7530 1727096051.06985: sending task result for task 0afff68d-5257-086b-f4f0-000000001962 7530 1727096051.07064: done sending task result for task 0afff68d-5257-086b-f4f0-000000001962 7530 1727096051.07069: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.018725", "end": "2024-09-23 08:54:11.018682", "rc": 0, "start": "2024-09-23 08:54:10.999957" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7530 1727096051.07148: no more pending results, returning what we have 7530 1727096051.07152: results queue empty 7530 1727096051.07153: checking for any_errors_fatal 7530 1727096051.07161: done checking for any_errors_fatal 7530 1727096051.07162: checking for max_fail_percentage 7530 1727096051.07164: done checking for max_fail_percentage 7530 1727096051.07165: checking to see if all hosts have failed and the running result is not ok 7530 1727096051.07166: done checking to see if all hosts have failed 7530 1727096051.07169: getting the remaining hosts for this loop 7530 1727096051.07171: done getting the remaining hosts for this loop 7530 1727096051.07175: getting the next task for host managed_node3 7530 1727096051.07184: done getting next task for host managed_node3 7530 1727096051.07186: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7530 1727096051.07191: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096051.07196: getting variables 7530 1727096051.07197: in VariableManager get_vars() 7530 1727096051.07252: Calling all_inventory to load vars for managed_node3 7530 1727096051.07255: Calling groups_inventory to load vars for managed_node3 7530 1727096051.07258: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096051.07476: Calling all_plugins_play to load vars for managed_node3 7530 1727096051.07488: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096051.07493: Calling groups_plugins_play to load vars for managed_node3 7530 1727096051.09086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096051.10714: done with get_vars() 7530 1727096051.10755: done getting variables 7530 1727096051.10818: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:54:11 -0400 (0:00:00.415) 0:00:41.896 ****** 7530 1727096051.10855: entering _queue_task() for managed_node3/set_fact 7530 1727096051.11317: worker is 1 (out of 1 available) 7530 1727096051.11331: exiting _queue_task() for managed_node3/set_fact 7530 1727096051.11344: done queuing things up, now waiting for results queue to drain 7530 1727096051.11346: waiting for pending results... 7530 1727096051.11692: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7530 1727096051.11731: in run() - task 0afff68d-5257-086b-f4f0-000000001963 7530 1727096051.11796: variable 'ansible_search_path' from source: unknown 7530 1727096051.11800: variable 'ansible_search_path' from source: unknown 7530 1727096051.11813: calling self._execute() 7530 1727096051.11930: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096051.11943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096051.11960: variable 'omit' from source: magic vars 7530 1727096051.12375: variable 'ansible_distribution_major_version' from source: facts 7530 1727096051.12450: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096051.12540: variable 'nm_profile_exists' from source: set_fact 7530 1727096051.12572: Evaluated conditional (nm_profile_exists.rc == 0): True 7530 1727096051.12584: variable 'omit' from source: magic vars 7530 1727096051.12640: variable 'omit' from source: magic vars 7530 1727096051.12689: variable 'omit' from source: magic vars 7530 1727096051.12738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096051.12787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 7530 1727096051.12886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096051.12889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096051.12891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096051.12893: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096051.12895: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096051.12901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096051.12997: Set connection var ansible_pipelining to False 7530 1727096051.13012: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096051.13021: Set connection var ansible_timeout to 10 7530 1727096051.13033: Set connection var ansible_shell_executable to /bin/sh 7530 1727096051.13039: Set connection var ansible_shell_type to sh 7530 1727096051.13044: Set connection var ansible_connection to ssh 7530 1727096051.13073: variable 'ansible_shell_executable' from source: unknown 7530 1727096051.13101: variable 'ansible_connection' from source: unknown 7530 1727096051.13104: variable 'ansible_module_compression' from source: unknown 7530 1727096051.13107: variable 'ansible_shell_type' from source: unknown 7530 1727096051.13109: variable 'ansible_shell_executable' from source: unknown 7530 1727096051.13114: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096051.13209: variable 'ansible_pipelining' from source: unknown 7530 1727096051.13212: variable 'ansible_timeout' from source: unknown 7530 1727096051.13215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096051.13287: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096051.13302: variable 'omit' from source: magic vars 7530 1727096051.13310: starting attempt loop 7530 1727096051.13316: running the handler 7530 1727096051.13339: handler run complete 7530 1727096051.13354: attempt loop complete, returning result 7530 1727096051.13360: _execute() done 7530 1727096051.13365: dumping result to json 7530 1727096051.13373: done dumping result, returning 7530 1727096051.13443: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-086b-f4f0-000000001963] 7530 1727096051.13446: sending task result for task 0afff68d-5257-086b-f4f0-000000001963 7530 1727096051.13515: done sending task result for task 0afff68d-5257-086b-f4f0-000000001963 7530 1727096051.13519: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7530 1727096051.13604: no more pending results, returning what we have 7530 1727096051.13608: results queue empty 7530 1727096051.13609: checking for any_errors_fatal 7530 1727096051.13617: done checking for any_errors_fatal 7530 1727096051.13618: checking for max_fail_percentage 7530 1727096051.13620: done checking for max_fail_percentage 7530 1727096051.13621: checking to see if all hosts have failed and the running result is not ok 7530 1727096051.13622: done checking to see if all hosts have failed 7530 1727096051.13623: getting the remaining hosts for this loop 7530 1727096051.13624: done getting the remaining hosts for this loop 7530 1727096051.13628: getting the next task for host managed_node3 7530 
1727096051.13637: done getting next task for host managed_node3 7530 1727096051.13640: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7530 1727096051.13645: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096051.13649: getting variables 7530 1727096051.13870: in VariableManager get_vars() 7530 1727096051.13918: Calling all_inventory to load vars for managed_node3 7530 1727096051.13921: Calling groups_inventory to load vars for managed_node3 7530 1727096051.13923: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096051.13934: Calling all_plugins_play to load vars for managed_node3 7530 1727096051.13936: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096051.13940: Calling groups_plugins_play to load vars for managed_node3 7530 1727096051.15561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096051.17229: done with get_vars() 7530 1727096051.17265: done getting variables 7530 1727096051.17334: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096051.17463: variable 'profile' from source: include params 7530 1727096051.17469: variable 'interface' from source: play vars 7530 1727096051.17528: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:54:11 -0400 (0:00:00.067) 0:00:41.964 ****** 7530 1727096051.17576: entering _queue_task() for managed_node3/command 7530 1727096051.18028: worker is 1 (out of 1 available) 7530 1727096051.18039: exiting _queue_task() for managed_node3/command 7530 1727096051.18053: done queuing things up, now waiting for results queue to drain 7530 1727096051.18054: waiting for pending results... 
7530 1727096051.18331: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7530 1727096051.18430: in run() - task 0afff68d-5257-086b-f4f0-000000001965 7530 1727096051.18473: variable 'ansible_search_path' from source: unknown 7530 1727096051.18477: variable 'ansible_search_path' from source: unknown 7530 1727096051.18503: calling self._execute() 7530 1727096051.18620: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096051.18643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096051.18650: variable 'omit' from source: magic vars 7530 1727096051.19077: variable 'ansible_distribution_major_version' from source: facts 7530 1727096051.19089: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096051.19274: variable 'profile_stat' from source: set_fact 7530 1727096051.19278: Evaluated conditional (profile_stat.stat.exists): False 7530 1727096051.19280: when evaluation is False, skipping this task 7530 1727096051.19282: _execute() done 7530 1727096051.19289: dumping result to json 7530 1727096051.19292: done dumping result, returning 7530 1727096051.19295: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000001965] 7530 1727096051.19298: sending task result for task 0afff68d-5257-086b-f4f0-000000001965 7530 1727096051.19457: done sending task result for task 0afff68d-5257-086b-f4f0-000000001965 7530 1727096051.19460: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7530 1727096051.19687: no more pending results, returning what we have 7530 1727096051.19690: results queue empty 7530 1727096051.19691: checking for any_errors_fatal 7530 1727096051.19698: done checking for any_errors_fatal 7530 1727096051.19699: checking for max_fail_percentage 
7530 1727096051.19701: done checking for max_fail_percentage 7530 1727096051.19701: checking to see if all hosts have failed and the running result is not ok 7530 1727096051.19702: done checking to see if all hosts have failed 7530 1727096051.19703: getting the remaining hosts for this loop 7530 1727096051.19705: done getting the remaining hosts for this loop 7530 1727096051.19708: getting the next task for host managed_node3 7530 1727096051.19715: done getting next task for host managed_node3 7530 1727096051.19717: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7530 1727096051.19722: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096051.19725: getting variables 7530 1727096051.19727: in VariableManager get_vars() 7530 1727096051.19781: Calling all_inventory to load vars for managed_node3 7530 1727096051.19784: Calling groups_inventory to load vars for managed_node3 7530 1727096051.19787: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096051.19800: Calling all_plugins_play to load vars for managed_node3 7530 1727096051.19803: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096051.19806: Calling groups_plugins_play to load vars for managed_node3 7530 1727096051.21238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096051.22902: done with get_vars() 7530 1727096051.22945: done getting variables 7530 1727096051.23013: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096051.23142: variable 'profile' from source: include params 7530 1727096051.23146: variable 'interface' from source: play vars 7530 1727096051.23208: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:54:11 -0400 (0:00:00.056) 0:00:42.020 ****** 7530 1727096051.23248: entering _queue_task() for managed_node3/set_fact 7530 1727096051.23716: worker is 1 (out of 1 available) 7530 1727096051.23730: exiting _queue_task() for managed_node3/set_fact 7530 1727096051.23742: done queuing things up, now waiting for results queue to drain 7530 1727096051.23744: waiting for pending results... 
7530 1727096051.24092: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7530 1727096051.24109: in run() - task 0afff68d-5257-086b-f4f0-000000001966 7530 1727096051.24128: variable 'ansible_search_path' from source: unknown 7530 1727096051.24133: variable 'ansible_search_path' from source: unknown 7530 1727096051.24176: calling self._execute() 7530 1727096051.24299: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096051.24302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096051.24408: variable 'omit' from source: magic vars 7530 1727096051.24697: variable 'ansible_distribution_major_version' from source: facts 7530 1727096051.24718: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096051.24864: variable 'profile_stat' from source: set_fact 7530 1727096051.24889: Evaluated conditional (profile_stat.stat.exists): False 7530 1727096051.24898: when evaluation is False, skipping this task 7530 1727096051.24906: _execute() done 7530 1727096051.24914: dumping result to json 7530 1727096051.24921: done dumping result, returning 7530 1727096051.24933: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000001966] 7530 1727096051.24949: sending task result for task 0afff68d-5257-086b-f4f0-000000001966 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7530 1727096051.25226: no more pending results, returning what we have 7530 1727096051.25230: results queue empty 7530 1727096051.25232: checking for any_errors_fatal 7530 1727096051.25241: done checking for any_errors_fatal 7530 1727096051.25242: checking for max_fail_percentage 7530 1727096051.25244: done checking for max_fail_percentage 7530 1727096051.25245: checking to see if all hosts have failed and the 
running result is not ok 7530 1727096051.25247: done checking to see if all hosts have failed 7530 1727096051.25248: getting the remaining hosts for this loop 7530 1727096051.25249: done getting the remaining hosts for this loop 7530 1727096051.25254: getting the next task for host managed_node3 7530 1727096051.25262: done getting next task for host managed_node3 7530 1727096051.25265: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7530 1727096051.25272: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096051.25279: getting variables 7530 1727096051.25281: in VariableManager get_vars() 7530 1727096051.25338: Calling all_inventory to load vars for managed_node3 7530 1727096051.25342: Calling groups_inventory to load vars for managed_node3 7530 1727096051.25345: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096051.25359: Calling all_plugins_play to load vars for managed_node3 7530 1727096051.25362: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096051.25366: Calling groups_plugins_play to load vars for managed_node3 7530 1727096051.26105: done sending task result for task 0afff68d-5257-086b-f4f0-000000001966 7530 1727096051.26109: WORKER PROCESS EXITING 7530 1727096051.27196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096051.28848: done with get_vars() 7530 1727096051.28888: done getting variables 7530 1727096051.28944: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096051.29061: variable 'profile' from source: include params 7530 1727096051.29065: variable 'interface' from source: play vars 7530 1727096051.29130: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:54:11 -0400 (0:00:00.059) 0:00:42.079 ****** 7530 1727096051.29163: entering _queue_task() for managed_node3/command 7530 1727096051.29685: worker is 1 (out of 1 available) 7530 1727096051.29696: exiting _queue_task() for managed_node3/command 7530 1727096051.29708: done queuing 
things up, now waiting for results queue to drain 7530 1727096051.29710: waiting for pending results... 7530 1727096051.29866: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7530 1727096051.30006: in run() - task 0afff68d-5257-086b-f4f0-000000001967 7530 1727096051.30025: variable 'ansible_search_path' from source: unknown 7530 1727096051.30032: variable 'ansible_search_path' from source: unknown 7530 1727096051.30081: calling self._execute() 7530 1727096051.30193: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096051.30204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096051.30220: variable 'omit' from source: magic vars 7530 1727096051.30614: variable 'ansible_distribution_major_version' from source: facts 7530 1727096051.30632: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096051.30769: variable 'profile_stat' from source: set_fact 7530 1727096051.30789: Evaluated conditional (profile_stat.stat.exists): False 7530 1727096051.30797: when evaluation is False, skipping this task 7530 1727096051.30810: _execute() done 7530 1727096051.30919: dumping result to json 7530 1727096051.30922: done dumping result, returning 7530 1727096051.30925: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000001967] 7530 1727096051.30928: sending task result for task 0afff68d-5257-086b-f4f0-000000001967 7530 1727096051.30999: done sending task result for task 0afff68d-5257-086b-f4f0-000000001967 7530 1727096051.31002: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7530 1727096051.31074: no more pending results, returning what we have 7530 1727096051.31079: results queue empty 7530 1727096051.31080: checking for any_errors_fatal 7530 
1727096051.31091: done checking for any_errors_fatal 7530 1727096051.31092: checking for max_fail_percentage 7530 1727096051.31093: done checking for max_fail_percentage 7530 1727096051.31095: checking to see if all hosts have failed and the running result is not ok 7530 1727096051.31096: done checking to see if all hosts have failed 7530 1727096051.31096: getting the remaining hosts for this loop 7530 1727096051.31098: done getting the remaining hosts for this loop 7530 1727096051.31102: getting the next task for host managed_node3 7530 1727096051.31109: done getting next task for host managed_node3 7530 1727096051.31112: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7530 1727096051.31116: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096051.31121: getting variables 7530 1727096051.31123: in VariableManager get_vars() 7530 1727096051.31178: Calling all_inventory to load vars for managed_node3 7530 1727096051.31181: Calling groups_inventory to load vars for managed_node3 7530 1727096051.31184: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096051.31198: Calling all_plugins_play to load vars for managed_node3 7530 1727096051.31201: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096051.31204: Calling groups_plugins_play to load vars for managed_node3 7530 1727096051.32827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096051.34734: done with get_vars() 7530 1727096051.34775: done getting variables 7530 1727096051.34866: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096051.35022: variable 'profile' from source: include params 7530 1727096051.35027: variable 'interface' from source: play vars 7530 1727096051.35096: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:54:11 -0400 (0:00:00.059) 0:00:42.139 ****** 7530 1727096051.35139: entering _queue_task() for managed_node3/set_fact 7530 1727096051.35701: worker is 1 (out of 1 available) 7530 1727096051.35716: exiting _queue_task() for managed_node3/set_fact 7530 1727096051.35736: done queuing things up, now waiting for results queue to drain 7530 1727096051.35740: waiting for pending results... 
7530 1727096051.35986: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0
7530 1727096051.36185: in run() - task 0afff68d-5257-086b-f4f0-000000001968
7530 1727096051.36193: variable 'ansible_search_path' from source: unknown
7530 1727096051.36197: variable 'ansible_search_path' from source: unknown
7530 1727096051.36272: calling self._execute()
7530 1727096051.36387: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.36441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.36445: variable 'omit' from source: magic vars
7530 1727096051.36883: variable 'ansible_distribution_major_version' from source: facts
7530 1727096051.36906: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096051.37050: variable 'profile_stat' from source: set_fact
7530 1727096051.37097: Evaluated conditional (profile_stat.stat.exists): False
7530 1727096051.37100: when evaluation is False, skipping this task
7530 1727096051.37206: _execute() done
7530 1727096051.37209: dumping result to json
7530 1727096051.37213: done dumping result, returning
7530 1727096051.37216: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0afff68d-5257-086b-f4f0-000000001968]
7530 1727096051.37218: sending task result for task 0afff68d-5257-086b-f4f0-000000001968
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
7530 1727096051.37449: no more pending results, returning what we have
7530 1727096051.37456: results queue empty
7530 1727096051.37459: checking for any_errors_fatal
7530 1727096051.37474: done checking for any_errors_fatal
7530 1727096051.37475: checking for max_fail_percentage
7530 1727096051.37477: done checking for max_fail_percentage
7530 1727096051.37478: checking to see if all hosts have failed and the running result is not ok
7530 1727096051.37479: done checking to see if all hosts have failed
7530 1727096051.37480: getting the remaining hosts for this loop
7530 1727096051.37482: done getting the remaining hosts for this loop
7530 1727096051.37485: getting the next task for host managed_node3
7530 1727096051.37495: done getting next task for host managed_node3
7530 1727096051.37498: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
7530 1727096051.37502: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096051.37508: getting variables
7530 1727096051.37510: in VariableManager get_vars()
7530 1727096051.37797: Calling all_inventory to load vars for managed_node3
7530 1727096051.37801: Calling groups_inventory to load vars for managed_node3
7530 1727096051.37803: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096051.37822: done sending task result for task 0afff68d-5257-086b-f4f0-000000001968
7530 1727096051.37827: WORKER PROCESS EXITING
7530 1727096051.37842: Calling all_plugins_play to load vars for managed_node3
7530 1727096051.37846: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096051.37849: Calling groups_plugins_play to load vars for managed_node3
7530 1727096051.39849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096051.41428: done with get_vars()
7530 1727096051.41469: done getting variables
7530 1727096051.41539: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096051.41669: variable 'profile' from source: include params
7530 1727096051.41673: variable 'interface' from source: play vars
7530 1727096051.41737: variable 'interface' from source: play vars

TASK [Assert that the profile is present - 'veth0'] ****************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Monday 23 September 2024 08:54:11 -0400 (0:00:00.066) 0:00:42.206 ******
7530 1727096051.41771: entering _queue_task() for managed_node3/assert
7530 1727096051.42149: worker is 1 (out of 1 available)
7530 1727096051.42274: exiting _queue_task() for managed_node3/assert
7530 1727096051.42288: done queuing things up, now waiting for results queue to drain
7530 1727096051.42290: waiting for pending results...
7530 1727096051.42521: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0'
7530 1727096051.42659: in run() - task 0afff68d-5257-086b-f4f0-0000000016d3
7530 1727096051.42709: variable 'ansible_search_path' from source: unknown
7530 1727096051.42713: variable 'ansible_search_path' from source: unknown
7530 1727096051.42737: calling self._execute()
7530 1727096051.42861: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.42926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.42930: variable 'omit' from source: magic vars
7530 1727096051.43327: variable 'ansible_distribution_major_version' from source: facts
7530 1727096051.43352: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096051.43370: variable 'omit' from source: magic vars
7530 1727096051.43474: variable 'omit' from source: magic vars
7530 1727096051.43562: variable 'profile' from source: include params
7530 1727096051.43579: variable 'interface' from source: play vars
7530 1727096051.43656: variable 'interface' from source: play vars
7530 1727096051.43690: variable 'omit' from source: magic vars
7530 1727096051.43746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096051.43795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096051.43826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096051.43902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.43905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.43917: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096051.43927: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.43940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.44077: Set connection var ansible_pipelining to False
7530 1727096051.44093: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096051.44106: Set connection var ansible_timeout to 10
7530 1727096051.44137: Set connection var ansible_shell_executable to /bin/sh
7530 1727096051.44179: Set connection var ansible_shell_type to sh
7530 1727096051.44182: Set connection var ansible_connection to ssh
7530 1727096051.44245: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.44249: variable 'ansible_connection' from source: unknown
7530 1727096051.44252: variable 'ansible_module_compression' from source: unknown
7530 1727096051.44254: variable 'ansible_shell_type' from source: unknown
7530 1727096051.44257: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.44258: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.44263: variable 'ansible_pipelining' from source: unknown
7530 1727096051.44266: variable 'ansible_timeout' from source: unknown
7530 1727096051.44285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.44540: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096051.44576: variable 'omit' from source: magic vars
7530 1727096051.44581: starting attempt loop
7530 1727096051.44584: running the handler
7530 1727096051.44878: variable 'lsr_net_profile_exists' from source: set_fact
7530 1727096051.44882: Evaluated conditional (lsr_net_profile_exists): True
7530 1727096051.44884: handler run complete
7530 1727096051.44887: attempt loop complete, returning result
7530 1727096051.44889: _execute() done
7530 1727096051.44892: dumping result to json
7530 1727096051.44896: done dumping result, returning
7530 1727096051.44898: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [0afff68d-5257-086b-f4f0-0000000016d3]
7530 1727096051.44900: sending task result for task 0afff68d-5257-086b-f4f0-0000000016d3
7530 1727096051.45012: done sending task result for task 0afff68d-5257-086b-f4f0-0000000016d3
7530 1727096051.45016: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7530 1727096051.45238: no more pending results, returning what we have
7530 1727096051.45243: results queue empty
7530 1727096051.45244: checking for any_errors_fatal
7530 1727096051.45256: done checking for any_errors_fatal
7530 1727096051.45257: checking for max_fail_percentage
7530 1727096051.45262: done checking for max_fail_percentage
7530 1727096051.45263: checking to see if all hosts have failed and the running result is not ok
7530 1727096051.45264: done checking to see if all hosts have failed
7530 1727096051.45265: getting the remaining hosts for this loop
7530 1727096051.45266: done getting the remaining hosts for this loop
7530 1727096051.45272: getting the next task for host managed_node3
7530 1727096051.45280: done getting next task for host managed_node3
7530 1727096051.45283: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
7530 1727096051.45287: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096051.45291: getting variables
7530 1727096051.45293: in VariableManager get_vars()
7530 1727096051.45353: Calling all_inventory to load vars for managed_node3
7530 1727096051.45356: Calling groups_inventory to load vars for managed_node3
7530 1727096051.45359: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096051.45493: Calling all_plugins_play to load vars for managed_node3
7530 1727096051.45497: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096051.45501: Calling groups_plugins_play to load vars for managed_node3
7530 1727096051.47076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096051.55723: done with get_vars()
7530 1727096051.55757: done getting variables
7530 1727096051.55814: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096051.55922: variable 'profile' from source: include params
7530 1727096051.55926: variable 'interface' from source: play vars
7530 1727096051.55989: variable 'interface' from source: play vars

TASK [Assert that the ansible managed comment is present in 'veth0'] ***********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Monday 23 September 2024 08:54:11 -0400 (0:00:00.142) 0:00:42.348 ******
7530 1727096051.56023: entering _queue_task() for managed_node3/assert
7530 1727096051.56369: worker is 1 (out of 1 available)
7530 1727096051.56381: exiting _queue_task() for managed_node3/assert
7530 1727096051.56394: done queuing things up, now waiting for results queue to drain
7530 1727096051.56396: waiting for pending results...
7530 1727096051.56900: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0'
7530 1727096051.56906: in run() - task 0afff68d-5257-086b-f4f0-0000000016d4
7530 1727096051.56911: variable 'ansible_search_path' from source: unknown
7530 1727096051.56915: variable 'ansible_search_path' from source: unknown
7530 1727096051.56918: calling self._execute()
7530 1727096051.57013: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.57028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.57046: variable 'omit' from source: magic vars
7530 1727096051.57460: variable 'ansible_distribution_major_version' from source: facts
7530 1727096051.57483: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096051.57495: variable 'omit' from source: magic vars
7530 1727096051.57548: variable 'omit' from source: magic vars
7530 1727096051.57674: variable 'profile' from source: include params
7530 1727096051.57685: variable 'interface' from source: play vars
7530 1727096051.57758: variable 'interface' from source: play vars
7530 1727096051.57787: variable 'omit' from source: magic vars
7530 1727096051.57862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096051.57891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096051.57918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096051.57972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.57976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.58001: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096051.58009: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.58015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.58144: Set connection var ansible_pipelining to False
7530 1727096051.58190: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096051.58193: Set connection var ansible_timeout to 10
7530 1727096051.58195: Set connection var ansible_shell_executable to /bin/sh
7530 1727096051.58198: Set connection var ansible_shell_type to sh
7530 1727096051.58200: Set connection var ansible_connection to ssh
7530 1727096051.58223: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.58230: variable 'ansible_connection' from source: unknown
7530 1727096051.58242: variable 'ansible_module_compression' from source: unknown
7530 1727096051.58298: variable 'ansible_shell_type' from source: unknown
7530 1727096051.58301: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.58303: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.58305: variable 'ansible_pipelining' from source: unknown
7530 1727096051.58308: variable 'ansible_timeout' from source: unknown
7530 1727096051.58310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.58448: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096051.58469: variable 'omit' from source: magic vars
7530 1727096051.58481: starting attempt loop
7530 1727096051.58488: running the handler
7530 1727096051.58636: variable 'lsr_net_profile_ansible_managed' from source: set_fact
7530 1727096051.58736: Evaluated conditional (lsr_net_profile_ansible_managed): True
7530 1727096051.58739: handler run complete
7530 1727096051.58742: attempt loop complete, returning result
7530 1727096051.58744: _execute() done
7530 1727096051.58747: dumping result to json
7530 1727096051.58749: done dumping result, returning
7530 1727096051.58751: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0afff68d-5257-086b-f4f0-0000000016d4]
7530 1727096051.58754: sending task result for task 0afff68d-5257-086b-f4f0-0000000016d4
7530 1727096051.58829: done sending task result for task 0afff68d-5257-086b-f4f0-0000000016d4
7530 1727096051.58869: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7530 1727096051.59126: no more pending results, returning what we have
7530 1727096051.59129: results queue empty
7530 1727096051.59131: checking for any_errors_fatal
7530 1727096051.59140: done checking for any_errors_fatal
7530 1727096051.59141: checking for max_fail_percentage
7530 1727096051.59142: done checking for max_fail_percentage
7530 1727096051.59143: checking to see if all hosts have failed and the running result is not ok
7530 1727096051.59144: done checking to see if all hosts have failed
7530 1727096051.59145: getting the remaining hosts for this loop
7530 1727096051.59147: done getting the remaining hosts for this loop
7530 1727096051.59151: getting the next task for host managed_node3
7530 1727096051.59158: done getting next task for host managed_node3
7530 1727096051.59161: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
7530 1727096051.59164: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096051.59170: getting variables
7530 1727096051.59172: in VariableManager get_vars()
7530 1727096051.59230: Calling all_inventory to load vars for managed_node3
7530 1727096051.59236: Calling groups_inventory to load vars for managed_node3
7530 1727096051.59240: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096051.59253: Calling all_plugins_play to load vars for managed_node3
7530 1727096051.59256: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096051.59259: Calling groups_plugins_play to load vars for managed_node3
7530 1727096051.60759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096051.62374: done with get_vars()
7530 1727096051.62408: done getting variables
7530 1727096051.62474: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
7530 1727096051.62597: variable 'profile' from source: include params
7530 1727096051.62601: variable 'interface' from source: play vars
7530 1727096051.62662: variable 'interface' from source: play vars

TASK [Assert that the fingerprint comment is present in veth0] *****************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Monday 23 September 2024 08:54:11 -0400 (0:00:00.066) 0:00:42.415 ******
7530 1727096051.62703: entering _queue_task() for managed_node3/assert
7530 1727096051.63056: worker is 1 (out of 1 available)
7530 1727096051.63071: exiting _queue_task() for managed_node3/assert
7530 1727096051.63084: done queuing things up, now waiting for results queue to drain
7530 1727096051.63086: waiting for pending results...
7530 1727096051.63387: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0
7530 1727096051.63538: in run() - task 0afff68d-5257-086b-f4f0-0000000016d5
7530 1727096051.63559: variable 'ansible_search_path' from source: unknown
7530 1727096051.63569: variable 'ansible_search_path' from source: unknown
7530 1727096051.63615: calling self._execute()
7530 1727096051.63739: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.63752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.63769: variable 'omit' from source: magic vars
7530 1727096051.64175: variable 'ansible_distribution_major_version' from source: facts
7530 1727096051.64196: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096051.64210: variable 'omit' from source: magic vars
7530 1727096051.64256: variable 'omit' from source: magic vars
7530 1727096051.64376: variable 'profile' from source: include params
7530 1727096051.64387: variable 'interface' from source: play vars
7530 1727096051.64455: variable 'interface' from source: play vars
7530 1727096051.64488: variable 'omit' from source: magic vars
7530 1727096051.64537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096051.64578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096051.64609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096051.64631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.64651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.64690: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096051.64705: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.64872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.64875: Set connection var ansible_pipelining to False
7530 1727096051.64878: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096051.64880: Set connection var ansible_timeout to 10
7530 1727096051.64882: Set connection var ansible_shell_executable to /bin/sh
7530 1727096051.64884: Set connection var ansible_shell_type to sh
7530 1727096051.64887: Set connection var ansible_connection to ssh
7530 1727096051.64888: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.64897: variable 'ansible_connection' from source: unknown
7530 1727096051.64904: variable 'ansible_module_compression' from source: unknown
7530 1727096051.64910: variable 'ansible_shell_type' from source: unknown
7530 1727096051.64917: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.64923: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.64930: variable 'ansible_pipelining' from source: unknown
7530 1727096051.64939: variable 'ansible_timeout' from source: unknown
7530 1727096051.64947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.65113: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096051.65135: variable 'omit' from source: magic vars
7530 1727096051.65146: starting attempt loop
7530 1727096051.65154: running the handler
7530 1727096051.65274: variable 'lsr_net_profile_fingerprint' from source: set_fact
7530 1727096051.65285: Evaluated conditional (lsr_net_profile_fingerprint): True
7530 1727096051.65295: handler run complete
7530 1727096051.65315: attempt loop complete, returning result
7530 1727096051.65325: _execute() done
7530 1727096051.65337: dumping result to json
7530 1727096051.65345: done dumping result, returning
7530 1727096051.65442: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [0afff68d-5257-086b-f4f0-0000000016d5]
7530 1727096051.65445: sending task result for task 0afff68d-5257-086b-f4f0-0000000016d5
7530 1727096051.65514: done sending task result for task 0afff68d-5257-086b-f4f0-0000000016d5
7530 1727096051.65517: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7530 1727096051.65595: no more pending results, returning what we have
7530 1727096051.65599: results queue empty
7530 1727096051.65600: checking for any_errors_fatal
7530 1727096051.65606: done checking for any_errors_fatal
7530 1727096051.65607: checking for max_fail_percentage
7530 1727096051.65609: done checking for max_fail_percentage
7530 1727096051.65610: checking to see if all hosts have failed and the running result is not ok
7530 1727096051.65611: done checking to see if all hosts have failed
7530 1727096051.65611: getting the remaining hosts for this loop
7530 1727096051.65613: done getting the remaining hosts for this loop
7530 1727096051.65616: getting the next task for host managed_node3
7530 1727096051.65624: done getting next task for host managed_node3
7530 1727096051.65627: ^ task is: TASK: Show ipv4 routes
7530 1727096051.65629: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096051.65636: getting variables
7530 1727096051.65638: in VariableManager get_vars()
7530 1727096051.65692: Calling all_inventory to load vars for managed_node3
7530 1727096051.65696: Calling groups_inventory to load vars for managed_node3
7530 1727096051.65698: Calling all_plugins_inventory to load vars for managed_node3
7530 1727096051.65710: Calling all_plugins_play to load vars for managed_node3
7530 1727096051.65714: Calling groups_plugins_inventory to load vars for managed_node3
7530 1727096051.65716: Calling groups_plugins_play to load vars for managed_node3
7530 1727096051.67560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096051.69195: done with get_vars()
7530 1727096051.69228: done getting variables
7530 1727096051.69299: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show ipv4 routes] ********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:114
Monday 23 September 2024 08:54:11 -0400 (0:00:00.066) 0:00:42.481 ******
7530 1727096051.69332: entering _queue_task() for managed_node3/command
7530 1727096051.69707: worker is 1 (out of 1 available)
7530 1727096051.69720: exiting _queue_task() for managed_node3/command
7530 1727096051.69736: done queuing things up, now waiting for results queue to drain
7530 1727096051.69738: waiting for pending results...
7530 1727096051.70091: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes
7530 1727096051.70201: in run() - task 0afff68d-5257-086b-f4f0-0000000000ff
7530 1727096051.70242: variable 'ansible_search_path' from source: unknown
7530 1727096051.70288: calling self._execute()
7530 1727096051.70415: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.70456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.70460: variable 'omit' from source: magic vars
7530 1727096051.70904: variable 'ansible_distribution_major_version' from source: facts
7530 1727096051.70974: Evaluated conditional (ansible_distribution_major_version != '6'): True
7530 1727096051.70977: variable 'omit' from source: magic vars
7530 1727096051.70980: variable 'omit' from source: magic vars
7530 1727096051.71024: variable 'omit' from source: magic vars
7530 1727096051.71081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7530 1727096051.71138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7530 1727096051.71170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7530 1727096051.71225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.71228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7530 1727096051.71254: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7530 1727096051.71262: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.71271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.71387: Set connection var ansible_pipelining to False
7530 1727096051.71443: Set connection var ansible_module_compression to ZIP_DEFLATED
7530 1727096051.71447: Set connection var ansible_timeout to 10
7530 1727096051.71449: Set connection var ansible_shell_executable to /bin/sh
7530 1727096051.71452: Set connection var ansible_shell_type to sh
7530 1727096051.71454: Set connection var ansible_connection to ssh
7530 1727096051.71479: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.71488: variable 'ansible_connection' from source: unknown
7530 1727096051.71496: variable 'ansible_module_compression' from source: unknown
7530 1727096051.71503: variable 'ansible_shell_type' from source: unknown
7530 1727096051.71552: variable 'ansible_shell_executable' from source: unknown
7530 1727096051.71555: variable 'ansible_host' from source: host vars for 'managed_node3'
7530 1727096051.71558: variable 'ansible_pipelining' from source: unknown
7530 1727096051.71560: variable 'ansible_timeout' from source: unknown
7530 1727096051.71563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7530 1727096051.71701: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
7530 1727096051.71722: variable 'omit' from source: magic vars
7530 1727096051.71733: starting attempt loop
7530 1727096051.71740: running the handler
7530 1727096051.71879: _low_level_execute_command(): starting
7530 1727096051.71883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7530 1727096051.72598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.152 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7530 1727096051.72677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'
debug2: fd 3 setting O_NONBLOCK <<<
7530 1727096051.72706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7530 1727096051.72770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7530 1727096051.74544: stdout chunk (state=3): >>>/root <<<
7530 1727096051.74699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7530 1727096051.74723: stdout chunk (state=3): >>><<<
7530 1727096051.74760: stderr chunk (state=3): >>><<<
7530 1727096051.74805: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.152 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7530 1727096051.74830: _low_level_execute_command(): starting
7530 1727096051.74850: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668 `" && echo ansible-tmp-1727096051.7481544-9097-161107927797668="` echo /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668 `" ) && sleep 0'
7530 1727096051.75584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7530 1727096051.75656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.152 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096051.75789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096051.75834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096051.75855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096051.75918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096051.78002: stdout chunk (state=3): >>>ansible-tmp-1727096051.7481544-9097-161107927797668=/root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668 <<< 7530 1727096051.78122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096051.78145: stderr chunk (state=3): >>><<< 7530 1727096051.78154: stdout chunk (state=3): >>><<< 7530 1727096051.78184: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096051.7481544-9097-161107927797668=/root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096051.78399: variable 'ansible_module_compression' from source: unknown 7530 1727096051.78402: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096051.78405: variable 'ansible_facts' from source: unknown 7530 1727096051.78462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py 7530 1727096051.78835: Sending initial data 7530 1727096051.78839: Sent initial data (154 bytes) 7530 1727096051.79531: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096051.79539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096051.79561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096051.79629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096051.79635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096051.79637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096051.79683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096051.79695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096051.79745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096051.81446: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096051.81538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096051.81544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpugy4cxev /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py <<< 7530 1727096051.81613: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py" <<< 7530 1727096051.81618: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpugy4cxev" to remote "/root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py" <<< 7530 1727096051.82876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096051.82880: stdout chunk (state=3): >>><<< 7530 1727096051.82882: stderr chunk (state=3): >>><<< 7530 1727096051.82884: done transferring module to remote 7530 1727096051.82886: _low_level_execute_command(): starting 7530 1727096051.82889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/ /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py && sleep 0' 7530 1727096051.83997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096051.84028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096051.84050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096051.84054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096051.84121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096051.86508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096051.86512: stdout chunk (state=3): >>><<< 7530 1727096051.86514: stderr chunk (state=3): >>><<< 7530 1727096051.86517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096051.86519: _low_level_execute_command(): starting 7530 1727096051.86522: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/AnsiballZ_command.py && sleep 0' 7530 1727096051.87710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096051.87736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096051.87754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096051.87777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096051.87795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096051.87830: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096051.87928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096051.88242: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096051.88308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.05694: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-23 08:54:12.050410", "end": "2024-09-23 08:54:12.054185", "delta": "0:00:00.003775", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096052.07858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096052.07878: stdout chunk (state=3): >>><<< 7530 1727096052.07903: stderr chunk (state=3): >>><<< 7530 1727096052.07928: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-23 08:54:12.050410", "end": "2024-09-23 08:54:12.054185", "delta": "0:00:00.003775", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096052.07974: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096052.07994: _low_level_execute_command(): starting 7530 1727096052.08003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096051.7481544-9097-161107927797668/ > /dev/null 2>&1 && sleep 0' 7530 1727096052.08675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096052.08691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096052.08707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096052.08726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096052.08846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.08874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.08944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.10905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096052.10990: stderr chunk (state=3): >>><<< 7530 1727096052.10994: stdout chunk (state=3): >>><<< 7530 1727096052.11174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096052.11178: handler run complete 7530 1727096052.11180: Evaluated conditional (False): False 7530 1727096052.11183: attempt loop complete, returning result 7530 1727096052.11185: _execute() done 7530 1727096052.11187: dumping result to json 7530 1727096052.11189: done dumping result, returning 7530 1727096052.11192: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [0afff68d-5257-086b-f4f0-0000000000ff] 7530 1727096052.11194: sending task result for task 0afff68d-5257-086b-f4f0-0000000000ff 7530 1727096052.11277: done sending task result for task 0afff68d-5257-086b-f4f0-0000000000ff 7530 1727096052.11281: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003775", "end": "2024-09-23 08:54:12.054185", "rc": 0, "start": "2024-09-23 08:54:12.050410" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 7530 1727096052.11364: no more pending results, returning what we have 7530 1727096052.11371: results queue empty 7530 1727096052.11372: checking for any_errors_fatal 7530 1727096052.11380: done checking for any_errors_fatal 7530 1727096052.11381: checking for max_fail_percentage 7530 1727096052.11383: done checking for max_fail_percentage 7530 1727096052.11384: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.11385: done checking to see if all hosts have failed 7530 1727096052.11385: getting the remaining hosts for this loop 7530 1727096052.11387: done getting the remaining hosts for this loop 7530 1727096052.11391: getting the next task for host managed_node3 7530 1727096052.11398: done getting next task for host managed_node3 7530 1727096052.11401: ^ task is: TASK: 
Assert default ipv4 route is absent 7530 1727096052.11403: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096052.11409: getting variables 7530 1727096052.11411: in VariableManager get_vars() 7530 1727096052.11591: Calling all_inventory to load vars for managed_node3 7530 1727096052.11594: Calling groups_inventory to load vars for managed_node3 7530 1727096052.11598: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.11610: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.11614: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.11617: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.13342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.15359: done with get_vars() 7530 1727096052.15399: done getting variables 7530 1727096052.15464: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is absent] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:118 Monday 23 September 2024 08:54:12 -0400 (0:00:00.461) 0:00:42.943 ****** 7530 1727096052.15497: entering _queue_task() for managed_node3/assert 7530 1727096052.15882: worker is 1 (out of 1 available) 7530 1727096052.15896: exiting _queue_task() for managed_node3/assert 7530 1727096052.15910: done queuing 
things up, now waiting for results queue to drain 7530 1727096052.15912: waiting for pending results... 7530 1727096052.16291: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent 7530 1727096052.16297: in run() - task 0afff68d-5257-086b-f4f0-000000000100 7530 1727096052.16301: variable 'ansible_search_path' from source: unknown 7530 1727096052.16314: calling self._execute() 7530 1727096052.16430: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.16444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.16460: variable 'omit' from source: magic vars 7530 1727096052.16844: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.16863: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.16877: variable 'omit' from source: magic vars 7530 1727096052.16904: variable 'omit' from source: magic vars 7530 1727096052.16947: variable 'omit' from source: magic vars 7530 1727096052.16996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096052.17038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096052.17063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096052.17086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.17100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.17134: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096052.17144: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.17151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 
7530 1727096052.17255: Set connection var ansible_pipelining to False 7530 1727096052.17472: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096052.17476: Set connection var ansible_timeout to 10 7530 1727096052.17478: Set connection var ansible_shell_executable to /bin/sh 7530 1727096052.17480: Set connection var ansible_shell_type to sh 7530 1727096052.17482: Set connection var ansible_connection to ssh 7530 1727096052.17484: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.17486: variable 'ansible_connection' from source: unknown 7530 1727096052.17488: variable 'ansible_module_compression' from source: unknown 7530 1727096052.17490: variable 'ansible_shell_type' from source: unknown 7530 1727096052.17492: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.17494: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.17496: variable 'ansible_pipelining' from source: unknown 7530 1727096052.17497: variable 'ansible_timeout' from source: unknown 7530 1727096052.17499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.17503: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096052.17521: variable 'omit' from source: magic vars 7530 1727096052.17529: starting attempt loop 7530 1727096052.17536: running the handler 7530 1727096052.17710: variable '__test_str' from source: task vars 7530 1727096052.17793: variable 'interface' from source: play vars 7530 1727096052.17809: variable 'ipv4_routes' from source: set_fact 7530 1727096052.17826: Evaluated conditional (__test_str not in ipv4_routes.stdout): True 7530 1727096052.17836: handler run complete 7530 1727096052.17855: 
attempt loop complete, returning result 7530 1727096052.17862: _execute() done 7530 1727096052.17870: dumping result to json 7530 1727096052.17877: done dumping result, returning 7530 1727096052.17891: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent [0afff68d-5257-086b-f4f0-000000000100] 7530 1727096052.17903: sending task result for task 0afff68d-5257-086b-f4f0-000000000100 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096052.18055: no more pending results, returning what we have 7530 1727096052.18058: results queue empty 7530 1727096052.18059: checking for any_errors_fatal 7530 1727096052.18072: done checking for any_errors_fatal 7530 1727096052.18073: checking for max_fail_percentage 7530 1727096052.18074: done checking for max_fail_percentage 7530 1727096052.18075: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.18076: done checking to see if all hosts have failed 7530 1727096052.18077: getting the remaining hosts for this loop 7530 1727096052.18078: done getting the remaining hosts for this loop 7530 1727096052.18082: getting the next task for host managed_node3 7530 1727096052.18088: done getting next task for host managed_node3 7530 1727096052.18090: ^ task is: TASK: Get ipv6 routes 7530 1727096052.18092: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096052.18095: getting variables 7530 1727096052.18097: in VariableManager get_vars() 7530 1727096052.18148: Calling all_inventory to load vars for managed_node3 7530 1727096052.18151: Calling groups_inventory to load vars for managed_node3 7530 1727096052.18153: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.18162: done sending task result for task 0afff68d-5257-086b-f4f0-000000000100 7530 1727096052.18166: WORKER PROCESS EXITING 7530 1727096052.18281: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.18285: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.18289: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.19879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.21589: done with get_vars() 7530 1727096052.21622: done getting variables 7530 1727096052.21695: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:123 Monday 23 September 2024 08:54:12 -0400 (0:00:00.062) 0:00:43.005 ****** 7530 1727096052.21727: entering _queue_task() for managed_node3/command 7530 1727096052.22286: worker is 1 (out of 1 available) 7530 1727096052.22297: exiting _queue_task() for managed_node3/command 7530 1727096052.22307: done queuing things up, now waiting for results queue to drain 7530 1727096052.22308: waiting for pending results... 
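The "Assert default ipv4 route is absent" task that just completed above reduces to a substring check of `__test_str` against the captured `ip route` stdout (the log shows `Evaluated conditional (__test_str not in ipv4_routes.stdout): True`). A minimal sketch of that check, using the route table from the log; the value of `test_str` is hypothetical, since the log never prints the rendered `__test_str` template:

```python
# Sketch of the assert task's condition, assuming __test_str renders to a
# default-route line for the test interface (veth0). The stdout below is
# copied verbatim from the ip route output in the log above.
ipv4_routes_stdout = (
    "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n"
    "10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \n"
    "203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 "
)

# Hypothetical rendered value of __test_str; the real template lives in
# tests_auto_gateway.yml, not in this log.
test_str = "default via 203.0.113.1 dev veth0"

# The assertion passes when no default route exists through the test interface.
default_route_absent = test_str not in ipv4_routes_stdout
```

With the routes shown, the only default route goes via eth0, so the check evaluates True and the task reports "All assertions passed", matching the log.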
7530 1727096052.22420: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7530 1727096052.22625: in run() - task 0afff68d-5257-086b-f4f0-000000000101 7530 1727096052.22630: variable 'ansible_search_path' from source: unknown 7530 1727096052.22634: calling self._execute() 7530 1727096052.22712: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.22717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.22730: variable 'omit' from source: magic vars 7530 1727096052.23200: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.23214: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.23221: variable 'omit' from source: magic vars 7530 1727096052.23249: variable 'omit' from source: magic vars 7530 1727096052.23291: variable 'omit' from source: magic vars 7530 1727096052.23344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096052.23388: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096052.23417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096052.23434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.23476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.23480: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096052.23489: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.23492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.23607: Set connection var ansible_pipelining to False 7530 1727096052.23611: Set connection var 
ansible_module_compression to ZIP_DEFLATED 7530 1727096052.23673: Set connection var ansible_timeout to 10 7530 1727096052.23676: Set connection var ansible_shell_executable to /bin/sh 7530 1727096052.23679: Set connection var ansible_shell_type to sh 7530 1727096052.23682: Set connection var ansible_connection to ssh 7530 1727096052.23684: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.23686: variable 'ansible_connection' from source: unknown 7530 1727096052.23689: variable 'ansible_module_compression' from source: unknown 7530 1727096052.23692: variable 'ansible_shell_type' from source: unknown 7530 1727096052.23695: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.23696: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.23699: variable 'ansible_pipelining' from source: unknown 7530 1727096052.23703: variable 'ansible_timeout' from source: unknown 7530 1727096052.23706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.23899: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096052.23903: variable 'omit' from source: magic vars 7530 1727096052.23906: starting attempt loop 7530 1727096052.23908: running the handler 7530 1727096052.23910: _low_level_execute_command(): starting 7530 1727096052.23912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096052.24760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096052.24765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096052.24770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 
1727096052.24774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096052.24776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096052.24831: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.24872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096052.24888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.24914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.24982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.26684: stdout chunk (state=3): >>>/root <<< 7530 1727096052.26794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096052.26809: stderr chunk (state=3): >>><<< 7530 1727096052.26812: stdout chunk (state=3): >>><<< 7530 1727096052.26839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096052.26850: _low_level_execute_command(): starting 7530 1727096052.26857: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640 `" && echo ansible-tmp-1727096052.2683651-9125-61365474038640="` echo /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640 `" ) && sleep 0' 7530 1727096052.27326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096052.27330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.27341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096052.27344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.27395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096052.27398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.27400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.27448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.29481: stdout chunk (state=3): >>>ansible-tmp-1727096052.2683651-9125-61365474038640=/root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640 <<< 7530 1727096052.29588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096052.29617: stderr chunk (state=3): >>><<< 7530 1727096052.29620: stdout chunk (state=3): >>><<< 7530 1727096052.29640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096052.2683651-9125-61365474038640=/root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096052.29673: variable 'ansible_module_compression' from source: unknown 7530 1727096052.29716: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096052.29750: variable 'ansible_facts' from source: unknown 7530 1727096052.29810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py 7530 1727096052.29922: Sending initial data 7530 1727096052.29925: Sent initial data (153 bytes) 7530 1727096052.30400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096052.30406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096052.30409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096052.30413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.30462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096052.30466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.30470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.30513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.32239: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096052.32262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096052.32294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyqujd3mg /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py <<< 7530 1727096052.32302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py" <<< 7530 1727096052.32331: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyqujd3mg" to remote "/root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py" <<< 7530 1727096052.32850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096052.32901: stderr chunk (state=3): >>><<< 7530 1727096052.32905: stdout chunk (state=3): >>><<< 7530 1727096052.32935: done transferring module to remote 7530 1727096052.32947: _low_level_execute_command(): starting 7530 1727096052.32951: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/ /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py && sleep 0' 7530 1727096052.33504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096052.33510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096052.33516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.33580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.33607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.35521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096052.35550: stderr chunk (state=3): >>><<< 7530 1727096052.35554: stdout chunk (state=3): >>><<< 7530 1727096052.35573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096052.35578: _low_level_execute_command(): starting 7530 1727096052.35581: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/AnsiballZ_command.py && sleep 0' 7530 1727096052.36053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096052.36057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.36060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096052.36062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.36114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096052.36117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.36124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.36165: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.53090: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 08:54:12.524578", "end": "2024-09-23 08:54:12.528362", "delta": "0:00:00.003784", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096052.54834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096052.54870: stderr chunk (state=3): >>><<< 7530 1727096052.54874: stdout chunk (state=3): >>><<< 7530 1727096052.54891: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-23 08:54:12.524578", "end": "2024-09-23 08:54:12.528362", "delta": "0:00:00.003784", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096052.54920: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096052.54927: _low_level_execute_command(): starting 7530 1727096052.54932: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096052.2683651-9125-61365474038640/ > /dev/null 2>&1 && sleep 0' 7530 1727096052.55413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096052.55425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096052.55428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.55479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096052.55482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.55484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.55530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096052.57555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096052.57559: stdout chunk (state=3): >>><<< 7530 1727096052.57561: stderr chunk (state=3): >>><<< 7530 1727096052.57586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 7530 1727096052.57776: handler run complete 7530 1727096052.57780: Evaluated conditional (False): False 7530 1727096052.57782: attempt loop complete, returning result 7530 1727096052.57784: _execute() done 7530 1727096052.57786: dumping result to json 7530 1727096052.57788: done dumping result, returning 7530 1727096052.57790: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [0afff68d-5257-086b-f4f0-000000000101] 7530 1727096052.57792: sending task result for task 0afff68d-5257-086b-f4f0-000000000101 7530 1727096052.57869: done sending task result for task 0afff68d-5257-086b-f4f0-000000000101 7530 1727096052.57873: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003784", "end": "2024-09-23 08:54:12.528362", "rc": 0, "start": "2024-09-23 08:54:12.524578" } STDOUT: 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium 7530 1727096052.57952: no more pending results, returning what we have 7530 1727096052.57955: results queue empty 7530 1727096052.57956: checking for any_errors_fatal 7530 1727096052.57966: done checking for any_errors_fatal 7530 1727096052.57969: checking for max_fail_percentage 7530 1727096052.57971: done checking for max_fail_percentage 7530 1727096052.57972: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.57973: done checking to see if all hosts have failed 7530 1727096052.57974: getting the remaining hosts for this loop 7530 1727096052.57975: done getting the remaining hosts for this loop 7530 1727096052.57979: getting the next task for host managed_node3 7530 1727096052.57985: done getting next task for host managed_node3 7530 1727096052.57988: ^ task is: TASK: Assert default ipv6 route is absent 7530 1727096052.57990: ^ state is: HOST 
STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096052.57994: getting variables 7530 1727096052.57996: in VariableManager get_vars() 7530 1727096052.58048: Calling all_inventory to load vars for managed_node3 7530 1727096052.58051: Calling groups_inventory to load vars for managed_node3 7530 1727096052.58053: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.58066: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.58191: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.58197: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.59745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.61422: done with get_vars() 7530 1727096052.61461: done getting variables 7530 1727096052.61524: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is absent] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:127 Monday 23 September 2024 08:54:12 -0400 (0:00:00.398) 0:00:43.403 ****** 7530 1727096052.61559: entering _queue_task() for managed_node3/assert 7530 1727096052.61922: worker is 1 (out of 1 available) 7530 1727096052.61936: exiting _queue_task() for managed_node3/assert 7530 1727096052.61947: done queuing things up, now waiting for results queue to drain 7530 1727096052.61949: 
waiting for pending results... 7530 1727096052.62324: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent 7530 1727096052.62418: in run() - task 0afff68d-5257-086b-f4f0-000000000102 7530 1727096052.62423: variable 'ansible_search_path' from source: unknown 7530 1727096052.62463: calling self._execute() 7530 1727096052.62636: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.62640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.62644: variable 'omit' from source: magic vars 7530 1727096052.63018: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.63041: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.63296: variable 'network_provider' from source: set_fact 7530 1727096052.63301: Evaluated conditional (network_provider == "nm"): True 7530 1727096052.63305: variable 'omit' from source: magic vars 7530 1727096052.63307: variable 'omit' from source: magic vars 7530 1727096052.63310: variable 'omit' from source: magic vars 7530 1727096052.63350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096052.63403: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096052.63435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096052.63459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.63484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.63533: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096052.63544: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.63552: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.63677: Set connection var ansible_pipelining to False 7530 1727096052.63690: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096052.63727: Set connection var ansible_timeout to 10 7530 1727096052.63731: Set connection var ansible_shell_executable to /bin/sh 7530 1727096052.63733: Set connection var ansible_shell_type to sh 7530 1727096052.63735: Set connection var ansible_connection to ssh 7530 1727096052.63763: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.63775: variable 'ansible_connection' from source: unknown 7530 1727096052.63836: variable 'ansible_module_compression' from source: unknown 7530 1727096052.63839: variable 'ansible_shell_type' from source: unknown 7530 1727096052.63842: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.63844: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.63846: variable 'ansible_pipelining' from source: unknown 7530 1727096052.63848: variable 'ansible_timeout' from source: unknown 7530 1727096052.63850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.63988: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096052.64008: variable 'omit' from source: magic vars 7530 1727096052.64019: starting attempt loop 7530 1727096052.64027: running the handler 7530 1727096052.64214: variable '__test_str' from source: task vars 7530 1727096052.64300: variable 'interface' from source: play vars 7530 1727096052.64375: variable 'ipv6_route' from source: set_fact 7530 1727096052.64378: Evaluated conditional (__test_str not in ipv6_route.stdout): 
True 7530 1727096052.64380: handler run complete 7530 1727096052.64383: attempt loop complete, returning result 7530 1727096052.64385: _execute() done 7530 1727096052.64387: dumping result to json 7530 1727096052.64389: done dumping result, returning 7530 1727096052.64391: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent [0afff68d-5257-086b-f4f0-000000000102] 7530 1727096052.64394: sending task result for task 0afff68d-5257-086b-f4f0-000000000102 7530 1727096052.64706: done sending task result for task 0afff68d-5257-086b-f4f0-000000000102 7530 1727096052.64710: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7530 1727096052.64763: no more pending results, returning what we have 7530 1727096052.64769: results queue empty 7530 1727096052.64770: checking for any_errors_fatal 7530 1727096052.64781: done checking for any_errors_fatal 7530 1727096052.64782: checking for max_fail_percentage 7530 1727096052.64784: done checking for max_fail_percentage 7530 1727096052.64785: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.64787: done checking to see if all hosts have failed 7530 1727096052.64787: getting the remaining hosts for this loop 7530 1727096052.64789: done getting the remaining hosts for this loop 7530 1727096052.64793: getting the next task for host managed_node3 7530 1727096052.64800: done getting next task for host managed_node3 7530 1727096052.64803: ^ task is: TASK: TEARDOWN: remove profiles. 7530 1727096052.64805: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096052.64810: getting variables 7530 1727096052.64812: in VariableManager get_vars() 7530 1727096052.64875: Calling all_inventory to load vars for managed_node3 7530 1727096052.64878: Calling groups_inventory to load vars for managed_node3 7530 1727096052.64881: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.64895: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.64898: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.64901: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.66622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.68240: done with get_vars() 7530 1727096052.68275: done getting variables 7530 1727096052.68340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:133 Monday 23 September 2024 08:54:12 -0400 (0:00:00.068) 0:00:43.472 ****** 7530 1727096052.68374: entering _queue_task() for managed_node3/debug 7530 1727096052.68866: worker is 1 (out of 1 available) 7530 1727096052.68884: exiting _queue_task() for managed_node3/debug 7530 1727096052.68896: done queuing things up, now waiting for results queue to drain 7530 1727096052.68898: waiting for pending results... 7530 1727096052.69113: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
7530 1727096052.69276: in run() - task 0afff68d-5257-086b-f4f0-000000000103 7530 1727096052.69281: variable 'ansible_search_path' from source: unknown 7530 1727096052.69302: calling self._execute() 7530 1727096052.69427: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.69431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.69442: variable 'omit' from source: magic vars 7530 1727096052.69754: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.69765: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.69772: variable 'omit' from source: magic vars 7530 1727096052.69789: variable 'omit' from source: magic vars 7530 1727096052.69821: variable 'omit' from source: magic vars 7530 1727096052.69854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096052.69883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096052.69900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096052.69913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.69925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.69951: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096052.69955: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.69957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.70031: Set connection var ansible_pipelining to False 7530 1727096052.70040: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096052.70042: Set connection var ansible_timeout 
to 10 7530 1727096052.70053: Set connection var ansible_shell_executable to /bin/sh 7530 1727096052.70056: Set connection var ansible_shell_type to sh 7530 1727096052.70058: Set connection var ansible_connection to ssh 7530 1727096052.70079: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.70082: variable 'ansible_connection' from source: unknown 7530 1727096052.70085: variable 'ansible_module_compression' from source: unknown 7530 1727096052.70087: variable 'ansible_shell_type' from source: unknown 7530 1727096052.70090: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.70092: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.70094: variable 'ansible_pipelining' from source: unknown 7530 1727096052.70096: variable 'ansible_timeout' from source: unknown 7530 1727096052.70099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.70208: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096052.70220: variable 'omit' from source: magic vars 7530 1727096052.70224: starting attempt loop 7530 1727096052.70227: running the handler 7530 1727096052.70271: handler run complete 7530 1727096052.70284: attempt loop complete, returning result 7530 1727096052.70287: _execute() done 7530 1727096052.70289: dumping result to json 7530 1727096052.70291: done dumping result, returning 7530 1727096052.70298: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
[0afff68d-5257-086b-f4f0-000000000103] 7530 1727096052.70302: sending task result for task 0afff68d-5257-086b-f4f0-000000000103 7530 1727096052.70391: done sending task result for task 0afff68d-5257-086b-f4f0-000000000103 7530 1727096052.70394: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7530 1727096052.70445: no more pending results, returning what we have 7530 1727096052.70448: results queue empty 7530 1727096052.70449: checking for any_errors_fatal 7530 1727096052.70457: done checking for any_errors_fatal 7530 1727096052.70458: checking for max_fail_percentage 7530 1727096052.70460: done checking for max_fail_percentage 7530 1727096052.70461: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.70462: done checking to see if all hosts have failed 7530 1727096052.70462: getting the remaining hosts for this loop 7530 1727096052.70464: done getting the remaining hosts for this loop 7530 1727096052.70469: getting the next task for host managed_node3 7530 1727096052.70478: done getting next task for host managed_node3 7530 1727096052.70483: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7530 1727096052.70486: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096052.70511: getting variables 7530 1727096052.70513: in VariableManager get_vars() 7530 1727096052.70561: Calling all_inventory to load vars for managed_node3 7530 1727096052.70563: Calling groups_inventory to load vars for managed_node3 7530 1727096052.70566: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.70584: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.70586: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.70589: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.71593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.72952: done with get_vars() 7530 1727096052.72977: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:54:12 -0400 (0:00:00.046) 0:00:43.518 ****** 7530 1727096052.73056: entering _queue_task() for managed_node3/include_tasks 7530 1727096052.73326: worker is 1 (out of 1 available) 7530 1727096052.73344: exiting _queue_task() for managed_node3/include_tasks 7530 1727096052.73358: done queuing things up, now waiting for results queue to drain 7530 1727096052.73360: waiting for pending results... 
7530 1727096052.73552: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7530 1727096052.73660: in run() - task 0afff68d-5257-086b-f4f0-00000000010b 7530 1727096052.73674: variable 'ansible_search_path' from source: unknown 7530 1727096052.73678: variable 'ansible_search_path' from source: unknown 7530 1727096052.73710: calling self._execute() 7530 1727096052.73791: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.73796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.73811: variable 'omit' from source: magic vars 7530 1727096052.74104: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.74114: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.74121: _execute() done 7530 1727096052.74125: dumping result to json 7530 1727096052.74128: done dumping result, returning 7530 1727096052.74143: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-086b-f4f0-00000000010b] 7530 1727096052.74146: sending task result for task 0afff68d-5257-086b-f4f0-00000000010b 7530 1727096052.74240: done sending task result for task 0afff68d-5257-086b-f4f0-00000000010b 7530 1727096052.74243: WORKER PROCESS EXITING 7530 1727096052.74295: no more pending results, returning what we have 7530 1727096052.74300: in VariableManager get_vars() 7530 1727096052.74363: Calling all_inventory to load vars for managed_node3 7530 1727096052.74366: Calling groups_inventory to load vars for managed_node3 7530 1727096052.74377: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.74389: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.74392: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.74394: Calling groups_plugins_play to load vars for 
managed_node3 7530 1727096052.75854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.76820: done with get_vars() 7530 1727096052.76846: variable 'ansible_search_path' from source: unknown 7530 1727096052.76847: variable 'ansible_search_path' from source: unknown 7530 1727096052.76881: we have included files to process 7530 1727096052.76882: generating all_blocks data 7530 1727096052.76884: done generating all_blocks data 7530 1727096052.76888: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7530 1727096052.76889: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7530 1727096052.76891: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7530 1727096052.77298: done processing included file 7530 1727096052.77300: iterating over new_blocks loaded from include file 7530 1727096052.77301: in VariableManager get_vars() 7530 1727096052.77326: done with get_vars() 7530 1727096052.77328: filtering new block on tags 7530 1727096052.77343: done filtering new block on tags 7530 1727096052.77345: in VariableManager get_vars() 7530 1727096052.77363: done with get_vars() 7530 1727096052.77364: filtering new block on tags 7530 1727096052.77380: done filtering new block on tags 7530 1727096052.77381: in VariableManager get_vars() 7530 1727096052.77399: done with get_vars() 7530 1727096052.77400: filtering new block on tags 7530 1727096052.77411: done filtering new block on tags 7530 1727096052.77413: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7530 1727096052.77418: extending task lists for all hosts with included blocks 7530 1727096052.78099: done 
extending task lists 7530 1727096052.78101: done processing included files 7530 1727096052.78102: results queue empty 7530 1727096052.78102: checking for any_errors_fatal 7530 1727096052.78106: done checking for any_errors_fatal 7530 1727096052.78107: checking for max_fail_percentage 7530 1727096052.78108: done checking for max_fail_percentage 7530 1727096052.78109: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.78110: done checking to see if all hosts have failed 7530 1727096052.78111: getting the remaining hosts for this loop 7530 1727096052.78112: done getting the remaining hosts for this loop 7530 1727096052.78115: getting the next task for host managed_node3 7530 1727096052.78122: done getting next task for host managed_node3 7530 1727096052.78125: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7530 1727096052.78129: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096052.78143: getting variables 7530 1727096052.78144: in VariableManager get_vars() 7530 1727096052.78170: Calling all_inventory to load vars for managed_node3 7530 1727096052.78172: Calling groups_inventory to load vars for managed_node3 7530 1727096052.78174: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.78180: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.78183: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.78185: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.79387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.80446: done with get_vars() 7530 1727096052.80475: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:54:12 -0400 (0:00:00.074) 0:00:43.593 ****** 7530 1727096052.80536: entering _queue_task() for managed_node3/setup 7530 1727096052.80810: worker is 1 (out of 1 available) 7530 1727096052.80824: exiting _queue_task() for managed_node3/setup 7530 1727096052.80839: done queuing things up, now waiting for results queue to drain 7530 1727096052.80841: waiting for pending results... 
7530 1727096052.81030: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7530 1727096052.81132: in run() - task 0afff68d-5257-086b-f4f0-0000000019b6 7530 1727096052.81144: variable 'ansible_search_path' from source: unknown 7530 1727096052.81147: variable 'ansible_search_path' from source: unknown 7530 1727096052.81183: calling self._execute() 7530 1727096052.81257: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.81261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.81272: variable 'omit' from source: magic vars 7530 1727096052.81562: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.81579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.81827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096052.83703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096052.83766: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096052.83796: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096052.83826: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096052.83849: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096052.83913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096052.83934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096052.83956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096052.83984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096052.83995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096052.84035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096052.84055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096052.84074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096052.84098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096052.84109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096052.84248: variable '__network_required_facts' from source: role '' defaults 
7530 1727096052.84252: variable 'ansible_facts' from source: unknown 7530 1727096052.84725: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7530 1727096052.84730: when evaluation is False, skipping this task 7530 1727096052.84735: _execute() done 7530 1727096052.84738: dumping result to json 7530 1727096052.84741: done dumping result, returning 7530 1727096052.84744: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-086b-f4f0-0000000019b6] 7530 1727096052.84747: sending task result for task 0afff68d-5257-086b-f4f0-0000000019b6 7530 1727096052.84838: done sending task result for task 0afff68d-5257-086b-f4f0-0000000019b6 7530 1727096052.84841: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096052.84892: no more pending results, returning what we have 7530 1727096052.84896: results queue empty 7530 1727096052.84897: checking for any_errors_fatal 7530 1727096052.84898: done checking for any_errors_fatal 7530 1727096052.84899: checking for max_fail_percentage 7530 1727096052.84900: done checking for max_fail_percentage 7530 1727096052.84901: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.84902: done checking to see if all hosts have failed 7530 1727096052.84903: getting the remaining hosts for this loop 7530 1727096052.84904: done getting the remaining hosts for this loop 7530 1727096052.84908: getting the next task for host managed_node3 7530 1727096052.84916: done getting next task for host managed_node3 7530 1727096052.84919: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7530 1727096052.84923: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096052.84946: getting variables 7530 1727096052.84948: in VariableManager get_vars() 7530 1727096052.85006: Calling all_inventory to load vars for managed_node3 7530 1727096052.85010: Calling groups_inventory to load vars for managed_node3 7530 1727096052.85012: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.85022: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.85024: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.85027: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.85867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.86760: done with get_vars() 7530 1727096052.86787: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:54:12 -0400 (0:00:00.063) 0:00:43.657 ****** 7530 1727096052.86873: entering _queue_task() for managed_node3/stat 7530 1727096052.87151: worker is 1 (out of 1 available) 7530 1727096052.87164: exiting _queue_task() 
for managed_node3/stat 7530 1727096052.87180: done queuing things up, now waiting for results queue to drain 7530 1727096052.87182: waiting for pending results... 7530 1727096052.87374: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7530 1727096052.87479: in run() - task 0afff68d-5257-086b-f4f0-0000000019b8 7530 1727096052.87492: variable 'ansible_search_path' from source: unknown 7530 1727096052.87496: variable 'ansible_search_path' from source: unknown 7530 1727096052.87527: calling self._execute() 7530 1727096052.87604: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.87611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.87621: variable 'omit' from source: magic vars 7530 1727096052.87912: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.87923: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.88047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096052.88251: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096052.88291: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096052.88316: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096052.88341: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096052.88408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096052.88426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096052.88445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096052.88462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096052.88536: variable '__network_is_ostree' from source: set_fact 7530 1727096052.88540: Evaluated conditional (not __network_is_ostree is defined): False 7530 1727096052.88543: when evaluation is False, skipping this task 7530 1727096052.88546: _execute() done 7530 1727096052.88548: dumping result to json 7530 1727096052.88550: done dumping result, returning 7530 1727096052.88559: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-086b-f4f0-0000000019b8] 7530 1727096052.88564: sending task result for task 0afff68d-5257-086b-f4f0-0000000019b8 7530 1727096052.88651: done sending task result for task 0afff68d-5257-086b-f4f0-0000000019b8 7530 1727096052.88653: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7530 1727096052.88708: no more pending results, returning what we have 7530 1727096052.88712: results queue empty 7530 1727096052.88713: checking for any_errors_fatal 7530 1727096052.88720: done checking for any_errors_fatal 7530 1727096052.88720: checking for max_fail_percentage 7530 1727096052.88722: done checking for max_fail_percentage 7530 1727096052.88723: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.88724: done checking to see if all hosts have failed 7530 
1727096052.88725: getting the remaining hosts for this loop 7530 1727096052.88726: done getting the remaining hosts for this loop 7530 1727096052.88730: getting the next task for host managed_node3 7530 1727096052.88739: done getting next task for host managed_node3 7530 1727096052.88742: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7530 1727096052.88746: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096052.88770: getting variables 7530 1727096052.88772: in VariableManager get_vars() 7530 1727096052.88818: Calling all_inventory to load vars for managed_node3 7530 1727096052.88821: Calling groups_inventory to load vars for managed_node3 7530 1727096052.88823: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.88832: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.88838: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.88841: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.89771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.90663: done with get_vars() 7530 1727096052.90691: done getting variables 7530 1727096052.90742: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:54:12 -0400 (0:00:00.039) 0:00:43.696 ****** 7530 1727096052.90770: entering _queue_task() for managed_node3/set_fact 7530 1727096052.91043: worker is 1 (out of 1 available) 7530 1727096052.91056: exiting _queue_task() for managed_node3/set_fact 7530 1727096052.91074: done queuing things up, now waiting for results queue to drain 7530 1727096052.91076: waiting for pending results... 
7530 1727096052.91258: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7530 1727096052.91372: in run() - task 0afff68d-5257-086b-f4f0-0000000019b9 7530 1727096052.91385: variable 'ansible_search_path' from source: unknown 7530 1727096052.91389: variable 'ansible_search_path' from source: unknown 7530 1727096052.91421: calling self._execute() 7530 1727096052.91499: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.91503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.91511: variable 'omit' from source: magic vars 7530 1727096052.91884: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.91976: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.92114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096052.92457: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096052.92530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096052.92641: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096052.92645: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096052.92729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096052.92779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096052.92813: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096052.92858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096052.92989: variable '__network_is_ostree' from source: set_fact 7530 1727096052.93004: Evaluated conditional (not __network_is_ostree is defined): False 7530 1727096052.93084: when evaluation is False, skipping this task 7530 1727096052.93087: _execute() done 7530 1727096052.93090: dumping result to json 7530 1727096052.93093: done dumping result, returning 7530 1727096052.93095: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-086b-f4f0-0000000019b9] 7530 1727096052.93098: sending task result for task 0afff68d-5257-086b-f4f0-0000000019b9 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7530 1727096052.93322: no more pending results, returning what we have 7530 1727096052.93326: results queue empty 7530 1727096052.93327: checking for any_errors_fatal 7530 1727096052.93337: done checking for any_errors_fatal 7530 1727096052.93338: checking for max_fail_percentage 7530 1727096052.93339: done checking for max_fail_percentage 7530 1727096052.93341: checking to see if all hosts have failed and the running result is not ok 7530 1727096052.93342: done checking to see if all hosts have failed 7530 1727096052.93343: getting the remaining hosts for this loop 7530 1727096052.93345: done getting the remaining hosts for this loop 7530 1727096052.93349: getting the next task for host managed_node3 7530 1727096052.93361: done getting next task for host managed_node3 7530 1727096052.93365: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which services are running 7530 1727096052.93372: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096052.93398: getting variables 7530 1727096052.93401: in VariableManager get_vars() 7530 1727096052.93458: Calling all_inventory to load vars for managed_node3 7530 1727096052.93461: Calling groups_inventory to load vars for managed_node3 7530 1727096052.93465: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096052.93480: done sending task result for task 0afff68d-5257-086b-f4f0-0000000019b9 7530 1727096052.93483: WORKER PROCESS EXITING 7530 1727096052.93680: Calling all_plugins_play to load vars for managed_node3 7530 1727096052.93684: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096052.93696: Calling groups_plugins_play to load vars for managed_node3 7530 1727096052.95312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096052.96948: done with get_vars() 7530 1727096052.96973: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task 
path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:54:12 -0400 (0:00:00.062) 0:00:43.758 ****** 7530 1727096052.97050: entering _queue_task() for managed_node3/service_facts 7530 1727096052.97317: worker is 1 (out of 1 available) 7530 1727096052.97330: exiting _queue_task() for managed_node3/service_facts 7530 1727096052.97346: done queuing things up, now waiting for results queue to drain 7530 1727096052.97348: waiting for pending results... 7530 1727096052.97542: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7530 1727096052.97645: in run() - task 0afff68d-5257-086b-f4f0-0000000019bb 7530 1727096052.97658: variable 'ansible_search_path' from source: unknown 7530 1727096052.97662: variable 'ansible_search_path' from source: unknown 7530 1727096052.97695: calling self._execute() 7530 1727096052.97774: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.97778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.97795: variable 'omit' from source: magic vars 7530 1727096052.98075: variable 'ansible_distribution_major_version' from source: facts 7530 1727096052.98085: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096052.98091: variable 'omit' from source: magic vars 7530 1727096052.98148: variable 'omit' from source: magic vars 7530 1727096052.98175: variable 'omit' from source: magic vars 7530 1727096052.98210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096052.98241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096052.98258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096052.98273: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.98282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096052.98304: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096052.98307: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.98311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.98386: Set connection var ansible_pipelining to False 7530 1727096052.98389: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096052.98395: Set connection var ansible_timeout to 10 7530 1727096052.98402: Set connection var ansible_shell_executable to /bin/sh 7530 1727096052.98405: Set connection var ansible_shell_type to sh 7530 1727096052.98408: Set connection var ansible_connection to ssh 7530 1727096052.98456: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.98459: variable 'ansible_connection' from source: unknown 7530 1727096052.98462: variable 'ansible_module_compression' from source: unknown 7530 1727096052.98464: variable 'ansible_shell_type' from source: unknown 7530 1727096052.98466: variable 'ansible_shell_executable' from source: unknown 7530 1727096052.98479: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096052.98481: variable 'ansible_pipelining' from source: unknown 7530 1727096052.98483: variable 'ansible_timeout' from source: unknown 7530 1727096052.98485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096052.98627: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
7530 1727096052.98638: variable 'omit' from source: magic vars 7530 1727096052.98641: starting attempt loop 7530 1727096052.98644: running the handler 7530 1727096052.98656: _low_level_execute_command(): starting 7530 1727096052.98663: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096052.99201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096052.99206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096052.99209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096052.99255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096052.99258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096052.99323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096053.01075: stdout chunk (state=3): >>>/root <<< 7530 1727096053.01173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096053.01213: stderr chunk (state=3): >>><<< 7530 1727096053.01215: 
stdout chunk (state=3): >>><<< 7530 1727096053.01235: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096053.01273: _low_level_execute_command(): starting 7530 1727096053.01276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316 `" && echo ansible-tmp-1727096053.0124133-9160-43189387405316="` echo /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316 `" ) && sleep 0' 7530 1727096053.01731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096053.01734: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096053.01737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096053.01750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096053.01801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096053.01804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096053.01806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096053.01854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096053.03921: stdout chunk (state=3): >>>ansible-tmp-1727096053.0124133-9160-43189387405316=/root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316 <<< 7530 1727096053.04074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096053.04078: stdout chunk (state=3): >>><<< 7530 1727096053.04273: stderr chunk (state=3): >>><<< 7530 1727096053.04277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096053.0124133-9160-43189387405316=/root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096053.04280: variable 'ansible_module_compression' from source: unknown 7530 1727096053.04282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7530 1727096053.04284: variable 'ansible_facts' from source: unknown 7530 1727096053.04349: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py 7530 1727096053.04525: Sending initial data 7530 1727096053.04534: Sent initial data (159 bytes) 7530 1727096053.05172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096053.05230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096053.05300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096053.05317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096053.05344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096053.05415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096053.07145: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096053.07209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096053.07277: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpcw4tvaqb /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py <<< 7530 1727096053.07281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py" <<< 7530 1727096053.07318: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpcw4tvaqb" to remote "/root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py" <<< 7530 1727096053.08189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096053.08193: stdout chunk (state=3): >>><<< 7530 1727096053.08195: stderr chunk (state=3): >>><<< 7530 1727096053.08204: done transferring module to remote 7530 1727096053.08218: _low_level_execute_command(): starting 7530 1727096053.08226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/ /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py && sleep 0' 7530 1727096053.08853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096053.08874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096053.08890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096053.08908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096053.08925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
<<< 7530 1727096053.08957: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096053.09057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096053.09087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096053.09103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096053.09183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096053.11105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096053.11250: stderr chunk (state=3): >>><<< 7530 1727096053.11253: stdout chunk (state=3): >>><<< 7530 1727096053.11256: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096053.11258: _low_level_execute_command(): starting 7530 1727096053.11261: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/AnsiballZ_service_facts.py && sleep 0' 7530 1727096053.11857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096053.11922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096053.11989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096053.12005: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096053.12032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096053.12118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096054.78537: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 7530 1727096054.78600: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7530 1727096054.80281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096054.80373: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 7530 1727096054.80376: stdout chunk (state=3): >>><<< 7530 1727096054.80378: stderr chunk (state=3): >>><<< 7530 1727096054.80405: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096054.81277: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096054.81282: _low_level_execute_command(): starting 7530 1727096054.81284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096053.0124133-9160-43189387405316/ > /dev/null 2>&1 && sleep 0' 7530 1727096054.81912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096054.81931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096054.82070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096054.82074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096054.82096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096054.82113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096054.82194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096054.84175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096054.84187: stdout chunk (state=3): >>><<< 7530 1727096054.84195: stderr chunk (state=3): >>><<< 7530 1727096054.84216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096054.84273: handler run complete 
7530 1727096054.84437: variable 'ansible_facts' from source: unknown 7530 1727096054.84593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096054.85106: variable 'ansible_facts' from source: unknown 7530 1727096054.85239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096054.85440: attempt loop complete, returning result 7530 1727096054.85444: _execute() done 7530 1727096054.85446: dumping result to json 7530 1727096054.85673: done dumping result, returning 7530 1727096054.85677: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-086b-f4f0-0000000019bb] 7530 1727096054.85679: sending task result for task 0afff68d-5257-086b-f4f0-0000000019bb 7530 1727096054.86336: done sending task result for task 0afff68d-5257-086b-f4f0-0000000019bb 7530 1727096054.86340: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096054.86422: no more pending results, returning what we have 7530 1727096054.86425: results queue empty 7530 1727096054.86426: checking for any_errors_fatal 7530 1727096054.86430: done checking for any_errors_fatal 7530 1727096054.86431: checking for max_fail_percentage 7530 1727096054.86433: done checking for max_fail_percentage 7530 1727096054.86434: checking to see if all hosts have failed and the running result is not ok 7530 1727096054.86434: done checking to see if all hosts have failed 7530 1727096054.86435: getting the remaining hosts for this loop 7530 1727096054.86436: done getting the remaining hosts for this loop 7530 1727096054.86440: getting the next task for host managed_node3 7530 1727096054.86446: done getting next task for host managed_node3 7530 1727096054.86450: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 7530 1727096054.86454: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096054.86465: getting variables 7530 1727096054.86469: in VariableManager get_vars() 7530 1727096054.86509: Calling all_inventory to load vars for managed_node3 7530 1727096054.86512: Calling groups_inventory to load vars for managed_node3 7530 1727096054.86514: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096054.86523: Calling all_plugins_play to load vars for managed_node3 7530 1727096054.86526: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096054.86529: Calling groups_plugins_play to load vars for managed_node3 7530 1727096054.88289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096054.89902: done with get_vars() 7530 1727096054.89937: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:54:14 -0400 
(0:00:01.929) 0:00:45.688 ****** 7530 1727096054.90040: entering _queue_task() for managed_node3/package_facts 7530 1727096054.90393: worker is 1 (out of 1 available) 7530 1727096054.90405: exiting _queue_task() for managed_node3/package_facts 7530 1727096054.90418: done queuing things up, now waiting for results queue to drain 7530 1727096054.90420: waiting for pending results... 7530 1727096054.90890: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7530 1727096054.90900: in run() - task 0afff68d-5257-086b-f4f0-0000000019bc 7530 1727096054.90921: variable 'ansible_search_path' from source: unknown 7530 1727096054.90929: variable 'ansible_search_path' from source: unknown 7530 1727096054.90976: calling self._execute() 7530 1727096054.91095: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096054.91109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096054.91126: variable 'omit' from source: magic vars 7530 1727096054.91672: variable 'ansible_distribution_major_version' from source: facts 7530 1727096054.91677: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096054.91680: variable 'omit' from source: magic vars 7530 1727096054.91682: variable 'omit' from source: magic vars 7530 1727096054.91685: variable 'omit' from source: magic vars 7530 1727096054.91740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096054.91785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096054.91819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096054.91846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096054.91863: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096054.91906: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096054.91919: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096054.91927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096054.92046: Set connection var ansible_pipelining to False 7530 1727096054.92059: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096054.92074: Set connection var ansible_timeout to 10 7530 1727096054.92090: Set connection var ansible_shell_executable to /bin/sh 7530 1727096054.92098: Set connection var ansible_shell_type to sh 7530 1727096054.92105: Set connection var ansible_connection to ssh 7530 1727096054.92173: variable 'ansible_shell_executable' from source: unknown 7530 1727096054.92176: variable 'ansible_connection' from source: unknown 7530 1727096054.92178: variable 'ansible_module_compression' from source: unknown 7530 1727096054.92181: variable 'ansible_shell_type' from source: unknown 7530 1727096054.92183: variable 'ansible_shell_executable' from source: unknown 7530 1727096054.92185: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096054.92187: variable 'ansible_pipelining' from source: unknown 7530 1727096054.92189: variable 'ansible_timeout' from source: unknown 7530 1727096054.92191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096054.92419: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096054.92461: variable 'omit' from source: magic vars 7530 1727096054.92464: starting attempt loop 7530 1727096054.92467: running the handler 7530 
1727096054.92482: _low_level_execute_command(): starting 7530 1727096054.92571: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096054.93352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096054.93383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096054.93402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096054.93437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096054.93500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096054.95230: stdout chunk (state=3): >>>/root <<< 7530 1727096054.95387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096054.95392: stdout chunk (state=3): >>><<< 7530 1727096054.95395: stderr chunk (state=3): >>><<< 7530 1727096054.95526: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096054.95531: _low_level_execute_command(): starting 7530 1727096054.95534: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012 `" && echo ansible-tmp-1727096054.9542377-9202-36451427453012="` echo /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012 `" ) && sleep 0' 7530 1727096054.96126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096054.96143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096054.96158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096054.96228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096054.96297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096054.96323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096054.96372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096054.96417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096054.98423: stdout chunk (state=3): >>>ansible-tmp-1727096054.9542377-9202-36451427453012=/root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012 <<< 7530 1727096054.98579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096054.98583: stdout chunk (state=3): >>><<< 7530 1727096054.98585: stderr chunk (state=3): >>><<< 7530 1727096054.98600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096054.9542377-9202-36451427453012=/root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096054.98654: variable 'ansible_module_compression' from source: unknown 7530 1727096054.98773: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7530 1727096054.98786: variable 'ansible_facts' from source: unknown 7530 1727096054.98998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py 7530 1727096054.99259: Sending initial data 7530 1727096054.99262: Sent initial data (159 bytes) 7530 1727096054.99834: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096054.99847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096054.99881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096054.99900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096054.99984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096054.99999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096055.00030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096055.00152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096055.01872: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096055.01884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096055.01926: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpocnc_btf /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py <<< 7530 1727096055.01930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py" <<< 7530 1727096055.02013: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpocnc_btf" to remote "/root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py" <<< 7530 1727096055.03978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096055.03982: stdout chunk (state=3): >>><<< 7530 1727096055.03985: stderr chunk (state=3): >>><<< 7530 1727096055.03990: done transferring module to remote 7530 1727096055.04123: _low_level_execute_command(): starting 7530 1727096055.04127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/ /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py && sleep 0' 7530 1727096055.05506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096055.05581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096055.05701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096055.05714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096055.05765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096055.07631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096055.07751: stderr chunk (state=3): >>><<< 7530 1727096055.07757: stdout chunk (state=3): >>><<< 7530 1727096055.07760: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096055.07763: _low_level_execute_command(): starting 7530 1727096055.07765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/AnsiballZ_package_facts.py && sleep 0' 7530 1727096055.08385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096055.08424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096055.08435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096055.08519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096055.08546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096055.08564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096055.08586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 7530 1727096055.08738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096055.54320: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 7530 1727096055.54348: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": 
[{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": 
"libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 7530 1727096055.54374: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": 
"openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 7530 1727096055.54405: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": 
"libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 7530 1727096055.54428: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": 
[{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": 
"28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": 
"libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 7530 1727096055.54438: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": 
[{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 7530 1727096055.54442: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": 
"2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 7530 1727096055.54468: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": 
"ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 7530 1727096055.54497: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 
7530 1727096055.54502: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 7530 1727096055.54508: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 7530 1727096055.54527: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": 
"5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7530 1727096055.56603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096055.56607: stdout chunk (state=3): >>><<< 7530 1727096055.56615: stderr chunk (state=3): >>><<< 7530 1727096055.56652: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096055.57962: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096055.57980: _low_level_execute_command(): starting 7530 1727096055.57984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096054.9542377-9202-36451427453012/ > /dev/null 2>&1 && sleep 0' 7530 1727096055.58625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096055.58682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096055.60546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096055.60575: stderr chunk (state=3): >>><<< 7530 1727096055.60578: stdout chunk (state=3): >>><<< 7530 1727096055.60593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096055.60599: handler run complete 7530 1727096055.61071: variable 'ansible_facts' from source: unknown 7530 1727096055.61337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.62776: variable 'ansible_facts' from source: unknown 
7530 1727096055.63104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.63496: attempt loop complete, returning result 7530 1727096055.63506: _execute() done 7530 1727096055.63509: dumping result to json 7530 1727096055.63637: done dumping result, returning 7530 1727096055.63641: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-086b-f4f0-0000000019bc] 7530 1727096055.63645: sending task result for task 0afff68d-5257-086b-f4f0-0000000019bc 7530 1727096055.65002: done sending task result for task 0afff68d-5257-086b-f4f0-0000000019bc 7530 1727096055.65009: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096055.65099: no more pending results, returning what we have 7530 1727096055.65101: results queue empty 7530 1727096055.65102: checking for any_errors_fatal 7530 1727096055.65106: done checking for any_errors_fatal 7530 1727096055.65107: checking for max_fail_percentage 7530 1727096055.65108: done checking for max_fail_percentage 7530 1727096055.65108: checking to see if all hosts have failed and the running result is not ok 7530 1727096055.65109: done checking to see if all hosts have failed 7530 1727096055.65109: getting the remaining hosts for this loop 7530 1727096055.65110: done getting the remaining hosts for this loop 7530 1727096055.65115: getting the next task for host managed_node3 7530 1727096055.65121: done getting next task for host managed_node3 7530 1727096055.65123: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096055.65126: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096055.65134: getting variables 7530 1727096055.65135: in VariableManager get_vars() 7530 1727096055.65164: Calling all_inventory to load vars for managed_node3 7530 1727096055.65166: Calling groups_inventory to load vars for managed_node3 7530 1727096055.65170: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096055.65176: Calling all_plugins_play to load vars for managed_node3 7530 1727096055.65178: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096055.65180: Calling groups_plugins_play to load vars for managed_node3 7530 1727096055.65870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.66814: done with get_vars() 7530 1727096055.66832: done getting variables 7530 1727096055.66882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:54:15 -0400 (0:00:00.768) 0:00:46.457 ****** 7530 1727096055.66908: entering _queue_task() for managed_node3/debug 7530 1727096055.67164: worker is 1 (out of 1 available) 7530 1727096055.67180: exiting _queue_task() for managed_node3/debug 7530 1727096055.67193: 
done queuing things up, now waiting for results queue to drain 7530 1727096055.67195: waiting for pending results... 7530 1727096055.67392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7530 1727096055.67490: in run() - task 0afff68d-5257-086b-f4f0-00000000010c 7530 1727096055.67504: variable 'ansible_search_path' from source: unknown 7530 1727096055.67508: variable 'ansible_search_path' from source: unknown 7530 1727096055.67540: calling self._execute() 7530 1727096055.67619: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.67623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.67634: variable 'omit' from source: magic vars 7530 1727096055.67925: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.67935: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096055.67943: variable 'omit' from source: magic vars 7530 1727096055.67987: variable 'omit' from source: magic vars 7530 1727096055.68058: variable 'network_provider' from source: set_fact 7530 1727096055.68077: variable 'omit' from source: magic vars 7530 1727096055.68122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096055.68152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096055.68169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096055.68184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096055.68194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096055.68219: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 
1727096055.68222: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.68224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.68299: Set connection var ansible_pipelining to False 7530 1727096055.68302: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096055.68311: Set connection var ansible_timeout to 10 7530 1727096055.68317: Set connection var ansible_shell_executable to /bin/sh 7530 1727096055.68320: Set connection var ansible_shell_type to sh 7530 1727096055.68322: Set connection var ansible_connection to ssh 7530 1727096055.68345: variable 'ansible_shell_executable' from source: unknown 7530 1727096055.68348: variable 'ansible_connection' from source: unknown 7530 1727096055.68351: variable 'ansible_module_compression' from source: unknown 7530 1727096055.68353: variable 'ansible_shell_type' from source: unknown 7530 1727096055.68355: variable 'ansible_shell_executable' from source: unknown 7530 1727096055.68357: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.68360: variable 'ansible_pipelining' from source: unknown 7530 1727096055.68362: variable 'ansible_timeout' from source: unknown 7530 1727096055.68366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.68474: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096055.68484: variable 'omit' from source: magic vars 7530 1727096055.68488: starting attempt loop 7530 1727096055.68491: running the handler 7530 1727096055.68529: handler run complete 7530 1727096055.68543: attempt loop complete, returning result 7530 1727096055.68546: _execute() done 7530 1727096055.68548: dumping 
result to json 7530 1727096055.68551: done dumping result, returning 7530 1727096055.68558: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-086b-f4f0-00000000010c] 7530 1727096055.68563: sending task result for task 0afff68d-5257-086b-f4f0-00000000010c 7530 1727096055.68650: done sending task result for task 0afff68d-5257-086b-f4f0-00000000010c 7530 1727096055.68653: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7530 1727096055.68722: no more pending results, returning what we have 7530 1727096055.68726: results queue empty 7530 1727096055.68727: checking for any_errors_fatal 7530 1727096055.68737: done checking for any_errors_fatal 7530 1727096055.68738: checking for max_fail_percentage 7530 1727096055.68740: done checking for max_fail_percentage 7530 1727096055.68741: checking to see if all hosts have failed and the running result is not ok 7530 1727096055.68742: done checking to see if all hosts have failed 7530 1727096055.68743: getting the remaining hosts for this loop 7530 1727096055.68744: done getting the remaining hosts for this loop 7530 1727096055.68748: getting the next task for host managed_node3 7530 1727096055.68755: done getting next task for host managed_node3 7530 1727096055.68759: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7530 1727096055.68770: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096055.68783: getting variables 7530 1727096055.68784: in VariableManager get_vars() 7530 1727096055.68829: Calling all_inventory to load vars for managed_node3 7530 1727096055.68832: Calling groups_inventory to load vars for managed_node3 7530 1727096055.68834: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096055.68843: Calling all_plugins_play to load vars for managed_node3 7530 1727096055.68846: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096055.68848: Calling groups_plugins_play to load vars for managed_node3 7530 1727096055.69636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.70500: done with get_vars() 7530 1727096055.70525: done getting variables 7530 1727096055.70572: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:54:15 -0400 (0:00:00.036) 0:00:46.494 ****** 7530 1727096055.70600: entering _queue_task() for managed_node3/fail 7530 1727096055.70861: worker is 1 (out of 1 available) 7530 1727096055.70876: exiting _queue_task() for managed_node3/fail 7530 1727096055.70889: done queuing things up, now waiting for results queue to drain 7530 1727096055.70891: waiting for pending results... 
7530 1727096055.71083: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7530 1727096055.71188: in run() - task 0afff68d-5257-086b-f4f0-00000000010d 7530 1727096055.71200: variable 'ansible_search_path' from source: unknown 7530 1727096055.71203: variable 'ansible_search_path' from source: unknown 7530 1727096055.71241: calling self._execute() 7530 1727096055.71319: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.71323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.71341: variable 'omit' from source: magic vars 7530 1727096055.71618: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.71628: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096055.71717: variable 'network_state' from source: role '' defaults 7530 1727096055.71727: Evaluated conditional (network_state != {}): False 7530 1727096055.71731: when evaluation is False, skipping this task 7530 1727096055.71733: _execute() done 7530 1727096055.71739: dumping result to json 7530 1727096055.71741: done dumping result, returning 7530 1727096055.71748: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-086b-f4f0-00000000010d] 7530 1727096055.71752: sending task result for task 0afff68d-5257-086b-f4f0-00000000010d 7530 1727096055.71847: done sending task result for task 0afff68d-5257-086b-f4f0-00000000010d 7530 1727096055.71850: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096055.71918: no more pending results, returning what we have 7530 
1727096055.71921: results queue empty 7530 1727096055.71922: checking for any_errors_fatal 7530 1727096055.71933: done checking for any_errors_fatal 7530 1727096055.71937: checking for max_fail_percentage 7530 1727096055.71938: done checking for max_fail_percentage 7530 1727096055.71939: checking to see if all hosts have failed and the running result is not ok 7530 1727096055.71940: done checking to see if all hosts have failed 7530 1727096055.71941: getting the remaining hosts for this loop 7530 1727096055.71942: done getting the remaining hosts for this loop 7530 1727096055.71946: getting the next task for host managed_node3 7530 1727096055.71953: done getting next task for host managed_node3 7530 1727096055.71957: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096055.71960: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096055.71983: getting variables 7530 1727096055.71985: in VariableManager get_vars() 7530 1727096055.72029: Calling all_inventory to load vars for managed_node3 7530 1727096055.72031: Calling groups_inventory to load vars for managed_node3 7530 1727096055.72033: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096055.72044: Calling all_plugins_play to load vars for managed_node3 7530 1727096055.72047: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096055.72049: Calling groups_plugins_play to load vars for managed_node3 7530 1727096055.72973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.78786: done with get_vars() 7530 1727096055.78812: done getting variables 7530 1727096055.78855: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:54:15 -0400 (0:00:00.082) 0:00:46.577 ****** 7530 1727096055.78879: entering _queue_task() for managed_node3/fail 7530 1727096055.79152: worker is 1 (out of 1 available) 7530 1727096055.79168: exiting _queue_task() for managed_node3/fail 7530 1727096055.79181: done queuing things up, now waiting for results queue to drain 7530 1727096055.79183: waiting for pending results... 
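The `^ state is: HOST STATE: block=2, task=37, ...` lines record the per-host cursor the play iterator keeps while walking the task list, including a nested child state for tasks inside a block (the `tasks child state? (HOST STATE: ...)` portion). A simplified, hypothetical model of that record (field names approximate the log labels, not the real implementation):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified model of the "HOST STATE" records printed above.
# One cursor exists per host; nested blocks get their own child cursor.
@dataclass
class HostState:
    cur_block: int = 0
    cur_task: int = 0
    run_state: int = 1            # e.g. "iterating tasks"
    fail_state: int = 0
    tasks_child_state: Optional["HostState"] = None
    did_rescue: bool = False

    def advance(self) -> None:
        # Advance the innermost cursor first, mirroring how the iterator
        # walks the nested block before moving the outer task index.
        if self.tasks_child_state is not None:
            self.tasks_child_state.advance()
        else:
            self.cur_task += 1

# The state logged above: outer block=2/task=37 with an inner block=0/task=7.
state = HostState(cur_block=2, cur_task=37,
                  tasks_child_state=HostState(cur_block=0, cur_task=7))
state.advance()
```

After `advance()` the inner task index moves from 7 to 8, which is exactly the change visible between consecutive `HOST STATE` lines in this log.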
7530 1727096055.79373: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7530 1727096055.79479: in run() - task 0afff68d-5257-086b-f4f0-00000000010e 7530 1727096055.79490: variable 'ansible_search_path' from source: unknown 7530 1727096055.79494: variable 'ansible_search_path' from source: unknown 7530 1727096055.79525: calling self._execute() 7530 1727096055.79609: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.79613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.79624: variable 'omit' from source: magic vars 7530 1727096055.79913: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.79923: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096055.80010: variable 'network_state' from source: role '' defaults 7530 1727096055.80019: Evaluated conditional (network_state != {}): False 7530 1727096055.80023: when evaluation is False, skipping this task 7530 1727096055.80026: _execute() done 7530 1727096055.80029: dumping result to json 7530 1727096055.80031: done dumping result, returning 7530 1727096055.80040: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-086b-f4f0-00000000010e] 7530 1727096055.80043: sending task result for task 0afff68d-5257-086b-f4f0-00000000010e 7530 1727096055.80140: done sending task result for task 0afff68d-5257-086b-f4f0-00000000010e 7530 1727096055.80142: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096055.80223: no more pending results, returning what we have 7530 1727096055.80226: results queue 
empty 7530 1727096055.80227: checking for any_errors_fatal 7530 1727096055.80238: done checking for any_errors_fatal 7530 1727096055.80238: checking for max_fail_percentage 7530 1727096055.80240: done checking for max_fail_percentage 7530 1727096055.80241: checking to see if all hosts have failed and the running result is not ok 7530 1727096055.80242: done checking to see if all hosts have failed 7530 1727096055.80243: getting the remaining hosts for this loop 7530 1727096055.80244: done getting the remaining hosts for this loop 7530 1727096055.80248: getting the next task for host managed_node3 7530 1727096055.80256: done getting next task for host managed_node3 7530 1727096055.80259: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096055.80262: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096055.80283: getting variables 7530 1727096055.80285: in VariableManager get_vars() 7530 1727096055.80327: Calling all_inventory to load vars for managed_node3 7530 1727096055.80329: Calling groups_inventory to load vars for managed_node3 7530 1727096055.80331: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096055.80342: Calling all_plugins_play to load vars for managed_node3 7530 1727096055.80345: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096055.80347: Calling groups_plugins_play to load vars for managed_node3 7530 1727096055.81139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.82033: done with get_vars() 7530 1727096055.82060: done getting variables 7530 1727096055.82107: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:54:15 -0400 (0:00:00.032) 0:00:46.609 ****** 7530 1727096055.82138: entering _queue_task() for managed_node3/fail 7530 1727096055.82409: worker is 1 (out of 1 available) 7530 1727096055.82422: exiting _queue_task() for managed_node3/fail 7530 1727096055.82437: done queuing things up, now waiting for results queue to drain 7530 1727096055.82439: waiting for pending results... 
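Each `TASK [...]` banner ends with two bracketed durations, e.g. `(0:00:00.032)       0:00:46.609`: the previous task's runtime and the cumulative time since the play started, both rendered as `H:MM:SS.mmm`. A small sketch of that formatting (`hms` is a hypothetical helper, not Ansible's code):

```python
# Hypothetical helper reproducing the "(0:00:00.032)       0:00:46.609"
# durations printed after each TASK header: previous task runtime and
# cumulative play time, formatted as H:MM:SS.mmm.
def hms(secs: float) -> str:
    h, rem = divmod(int(secs), 3600)
    m, s = divmod(rem, 60)
    ms = int(round((secs - int(secs)) * 1000))  # no carry handling; sketch only
    return f"{h}:{m:02d}:{s:02d}.{ms:03d}"

banner = f"({hms(0.032)})       {hms(46.609)}"
```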
7530 1727096055.82633: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7530 1727096055.82731: in run() - task 0afff68d-5257-086b-f4f0-00000000010f 7530 1727096055.82743: variable 'ansible_search_path' from source: unknown 7530 1727096055.82747: variable 'ansible_search_path' from source: unknown 7530 1727096055.82781: calling self._execute() 7530 1727096055.82862: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.82865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.82879: variable 'omit' from source: magic vars 7530 1727096055.83163: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.83175: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096055.83300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096055.85063: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096055.85393: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096055.85425: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096055.85455: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096055.85479: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096055.85542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.85564: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.85582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.85608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.85619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.85698: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.85713: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7530 1727096055.85798: variable 'ansible_distribution' from source: facts 7530 1727096055.85802: variable '__network_rh_distros' from source: role '' defaults 7530 1727096055.85812: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7530 1727096055.85978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.85997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.86015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.86041: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.86051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.86088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.86106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.86123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.86149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.86159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.86372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.86377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 
1727096055.86379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.86382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.86383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.86622: variable 'network_connections' from source: task vars 7530 1727096055.86641: variable 'interface' from source: play vars 7530 1727096055.86716: variable 'interface' from source: play vars 7530 1727096055.86734: variable 'network_state' from source: role '' defaults 7530 1727096055.86813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096055.86990: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096055.87035: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096055.87073: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096055.87124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096055.87175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096055.87200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 7530 1727096055.87238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.87271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096055.87303: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7530 1727096055.87311: when evaluation is False, skipping this task 7530 1727096055.87318: _execute() done 7530 1727096055.87324: dumping result to json 7530 1727096055.87330: done dumping result, returning 7530 1727096055.87341: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-086b-f4f0-00000000010f] 7530 1727096055.87352: sending task result for task 0afff68d-5257-086b-f4f0-00000000010f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7530 1727096055.87504: no more pending results, returning what we have 7530 1727096055.87507: results queue empty 7530 1727096055.87508: checking for any_errors_fatal 7530 1727096055.87512: done checking for any_errors_fatal 7530 1727096055.87513: checking for max_fail_percentage 7530 1727096055.87515: done checking for 
max_fail_percentage 7530 1727096055.87516: checking to see if all hosts have failed and the running result is not ok 7530 1727096055.87517: done checking to see if all hosts have failed 7530 1727096055.87517: getting the remaining hosts for this loop 7530 1727096055.87519: done getting the remaining hosts for this loop 7530 1727096055.87523: getting the next task for host managed_node3 7530 1727096055.87529: done getting next task for host managed_node3 7530 1727096055.87533: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7530 1727096055.87538: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096055.87557: getting variables 7530 1727096055.87558: in VariableManager get_vars() 7530 1727096055.87608: Calling all_inventory to load vars for managed_node3 7530 1727096055.87611: Calling groups_inventory to load vars for managed_node3 7530 1727096055.87613: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096055.87623: Calling all_plugins_play to load vars for managed_node3 7530 1727096055.87626: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096055.87628: Calling groups_plugins_play to load vars for managed_node3 7530 1727096055.88182: done sending task result for task 0afff68d-5257-086b-f4f0-00000000010f 7530 1727096055.88186: WORKER PROCESS EXITING 7530 1727096055.89260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096055.90459: done with get_vars() 7530 1727096055.90486: done getting variables 7530 1727096055.90531: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:54:15 -0400 (0:00:00.084) 0:00:46.693 ****** 7530 1727096055.90555: entering _queue_task() for managed_node3/dnf 7530 1727096055.90818: worker is 1 (out of 1 available) 7530 1727096055.90833: exiting _queue_task() for managed_node3/dnf 7530 1727096055.90845: done queuing things up, now waiting for results queue to drain 7530 1727096055.90847: waiting for pending results... 
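The teaming-abort task above was skipped because the logged `selectattr` chain found no team-typed entries in either `network_connections` or `network_state["interfaces"]`. A pure-Python emulation of that Jinja2 expression (an assumption for illustration, not Ansible's implementation; `match` is Ansible's regex test, emulated here with `re.match`):

```python
import re

# Emulates the conditional logged above:
#   network_connections | selectattr("type", "defined")
#       | selectattr("type", "match", "^team$") | list | length > 0
#   or network_state.get("interfaces", []) | ... (same chain)
# selectattr(attr, "defined") keeps items where the attribute exists;
# selectattr(attr, "match", pat) keeps items whose attribute matches pat.
def uses_team(network_connections, network_state):
    def team_items(items):
        return [i for i in items
                if "type" in i and re.match(r"^team$", i["type"])]
    return (len(team_items(network_connections)) > 0
            or len(team_items(network_state.get("interfaces", []))) > 0)
```

For this run the play only defines an ordinary interface connection and `network_state` is the empty default, so the emulation returns `False`, matching the `Conditional result was False` skip in the log.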
7530 1727096055.91042: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7530 1727096055.91145: in run() - task 0afff68d-5257-086b-f4f0-000000000110 7530 1727096055.91158: variable 'ansible_search_path' from source: unknown 7530 1727096055.91162: variable 'ansible_search_path' from source: unknown 7530 1727096055.91195: calling self._execute() 7530 1727096055.91279: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096055.91287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096055.91301: variable 'omit' from source: magic vars 7530 1727096055.91593: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.91602: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096055.91759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096055.94277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096055.94306: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096055.94357: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096055.94404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096055.94441: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096055.94532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.94574: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.94608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.94663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.94873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.94877: variable 'ansible_distribution' from source: facts 7530 1727096055.94880: variable 'ansible_distribution_major_version' from source: facts 7530 1727096055.94882: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7530 1727096055.94983: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096055.95115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.95146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.95177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.95218: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.95240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.95286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.95313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.95342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.95384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.95402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.95448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096055.95476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096055.95505: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.95546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096055.95564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096055.95725: variable 'network_connections' from source: task vars 7530 1727096055.95749: variable 'interface' from source: play vars 7530 1727096055.95975: variable 'interface' from source: play vars 7530 1727096055.96276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096055.96466: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096055.96651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096055.96695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096055.96975: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096055.96979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096055.96981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096055.97039: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096055.97070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096055.97128: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096055.97846: variable 'network_connections' from source: task vars 7530 1727096055.97954: variable 'interface' from source: play vars 7530 1727096055.97989: variable 'interface' from source: play vars 7530 1727096055.98023: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096055.98114: when evaluation is False, skipping this task 7530 1727096055.98159: _execute() done 7530 1727096055.98173: dumping result to json 7530 1727096055.98181: done dumping result, returning 7530 1727096055.98192: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000110] 7530 1727096055.98202: sending task result for task 0afff68d-5257-086b-f4f0-000000000110 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096055.98372: no more pending results, returning what we have 7530 1727096055.98376: results queue empty 7530 1727096055.98377: checking for any_errors_fatal 7530 1727096055.98383: done checking for any_errors_fatal 7530 1727096055.98384: checking for max_fail_percentage 7530 1727096055.98386: done checking for max_fail_percentage 7530 1727096055.98387: checking to see if all hosts have failed 
and the running result is not ok 7530 1727096055.98389: done checking to see if all hosts have failed 7530 1727096055.98390: getting the remaining hosts for this loop 7530 1727096055.98391: done getting the remaining hosts for this loop 7530 1727096055.98395: getting the next task for host managed_node3 7530 1727096055.98402: done getting next task for host managed_node3 7530 1727096055.98405: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7530 1727096055.98408: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096055.98548: getting variables 7530 1727096055.98551: in VariableManager get_vars() 7530 1727096055.98601: Calling all_inventory to load vars for managed_node3 7530 1727096055.98604: Calling groups_inventory to load vars for managed_node3 7530 1727096055.98607: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096055.98618: Calling all_plugins_play to load vars for managed_node3 7530 1727096055.98620: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096055.98623: Calling groups_plugins_play to load vars for managed_node3 7530 1727096055.99186: done sending task result for task 0afff68d-5257-086b-f4f0-000000000110 7530 1727096055.99191: WORKER PROCESS EXITING 7530 1727096056.00236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.03258: done with get_vars() 7530 1727096056.03497: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7530 1727096056.03599: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:54:16 -0400 (0:00:00.130) 0:00:46.824 ****** 7530 1727096056.03631: entering _queue_task() for managed_node3/yum 7530 1727096056.04318: worker is 1 (out of 1 available) 7530 1727096056.04334: exiting _queue_task() for managed_node3/yum 7530 1727096056.04347: done queuing things up, now waiting for results 
queue to drain 7530 1727096056.04349: waiting for pending results... 7530 1727096056.04606: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7530 1727096056.04809: in run() - task 0afff68d-5257-086b-f4f0-000000000111 7530 1727096056.04813: variable 'ansible_search_path' from source: unknown 7530 1727096056.04816: variable 'ansible_search_path' from source: unknown 7530 1727096056.04819: calling self._execute() 7530 1727096056.04928: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.04943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.04955: variable 'omit' from source: magic vars 7530 1727096056.05352: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.05372: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.05554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096056.10476: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096056.10482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096056.10485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096056.10489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096056.10576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096056.10827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7530 1727096056.10830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.10833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.10959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.10984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.11259: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.11291: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7530 1727096056.11300: when evaluation is False, skipping this task 7530 1727096056.11308: _execute() done 7530 1727096056.11315: dumping result to json 7530 1727096056.11323: done dumping result, returning 7530 1727096056.11336: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000111] 7530 1727096056.11346: sending task result for task 0afff68d-5257-086b-f4f0-000000000111 7530 1727096056.11791: done sending task result for task 0afff68d-5257-086b-f4f0-000000000111 7530 1727096056.11795: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7530 1727096056.11854: 
no more pending results, returning what we have 7530 1727096056.11858: results queue empty 7530 1727096056.11859: checking for any_errors_fatal 7530 1727096056.11869: done checking for any_errors_fatal 7530 1727096056.11870: checking for max_fail_percentage 7530 1727096056.11872: done checking for max_fail_percentage 7530 1727096056.11873: checking to see if all hosts have failed and the running result is not ok 7530 1727096056.11875: done checking to see if all hosts have failed 7530 1727096056.11876: getting the remaining hosts for this loop 7530 1727096056.11877: done getting the remaining hosts for this loop 7530 1727096056.11881: getting the next task for host managed_node3 7530 1727096056.11890: done getting next task for host managed_node3 7530 1727096056.11895: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7530 1727096056.11898: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096056.11924: getting variables 7530 1727096056.11927: in VariableManager get_vars() 7530 1727096056.12095: Calling all_inventory to load vars for managed_node3 7530 1727096056.12099: Calling groups_inventory to load vars for managed_node3 7530 1727096056.12102: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096056.12112: Calling all_plugins_play to load vars for managed_node3 7530 1727096056.12116: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096056.12119: Calling groups_plugins_play to load vars for managed_node3 7530 1727096056.16672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.20656: done with get_vars() 7530 1727096056.20812: done getting variables 7530 1727096056.20989: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:54:16 -0400 (0:00:00.173) 0:00:46.998 ****** 7530 1727096056.21032: entering _queue_task() for managed_node3/fail 7530 1727096056.21781: worker is 1 (out of 1 available) 7530 1727096056.21796: exiting _queue_task() for managed_node3/fail 7530 1727096056.21811: done queuing things up, now waiting for results queue to drain 7530 1727096056.21813: waiting for pending results... 
7530 1727096056.22294: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7530 1727096056.22612: in run() - task 0afff68d-5257-086b-f4f0-000000000112 7530 1727096056.22624: variable 'ansible_search_path' from source: unknown 7530 1727096056.22628: variable 'ansible_search_path' from source: unknown 7530 1727096056.22677: calling self._execute() 7530 1727096056.23094: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.23098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.23109: variable 'omit' from source: magic vars 7530 1727096056.23971: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.23985: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.24128: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096056.24545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096056.29610: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096056.29614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096056.29616: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096056.30074: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096056.30078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096056.30080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7530 1727096056.30083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.30210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.30257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.30276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.30346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.30487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.30511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.30556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.30973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7530 1727096056.30976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.30979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.30981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.30983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.30985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.31324: variable 'network_connections' from source: task vars 7530 1727096056.31349: variable 'interface' from source: play vars 7530 1727096056.31772: variable 'interface' from source: play vars 7530 1727096056.31776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096056.31919: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096056.32143: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096056.32306: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096056.32345: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7530 1727096056.32399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096056.32461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096056.32494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.32525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096056.32591: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096056.32857: variable 'network_connections' from source: task vars 7530 1727096056.32872: variable 'interface' from source: play vars 7530 1727096056.32942: variable 'interface' from source: play vars 7530 1727096056.32976: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096056.32984: when evaluation is False, skipping this task 7530 1727096056.32992: _execute() done 7530 1727096056.32999: dumping result to json 7530 1727096056.33006: done dumping result, returning 7530 1727096056.33018: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000112] 7530 1727096056.33027: sending task result for task 0afff68d-5257-086b-f4f0-000000000112 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 7530 1727096056.33198: no more pending results, returning what we have 7530 1727096056.33202: results queue empty 7530 1727096056.33203: checking for any_errors_fatal 7530 1727096056.33210: done checking for any_errors_fatal 7530 1727096056.33211: checking for max_fail_percentage 7530 1727096056.33213: done checking for max_fail_percentage 7530 1727096056.33214: checking to see if all hosts have failed and the running result is not ok 7530 1727096056.33215: done checking to see if all hosts have failed 7530 1727096056.33216: getting the remaining hosts for this loop 7530 1727096056.33217: done getting the remaining hosts for this loop 7530 1727096056.33221: getting the next task for host managed_node3 7530 1727096056.33228: done getting next task for host managed_node3 7530 1727096056.33232: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7530 1727096056.33237: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096056.33259: getting variables 7530 1727096056.33261: in VariableManager get_vars() 7530 1727096056.33507: Calling all_inventory to load vars for managed_node3 7530 1727096056.33511: Calling groups_inventory to load vars for managed_node3 7530 1727096056.33514: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096056.33526: Calling all_plugins_play to load vars for managed_node3 7530 1727096056.33529: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096056.33533: Calling groups_plugins_play to load vars for managed_node3 7530 1727096056.34203: done sending task result for task 0afff68d-5257-086b-f4f0-000000000112 7530 1727096056.34207: WORKER PROCESS EXITING 7530 1727096056.35159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.37285: done with get_vars() 7530 1727096056.37312: done getting variables 7530 1727096056.37378: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:54:16 -0400 (0:00:00.163) 0:00:47.162 ****** 7530 1727096056.37460: entering _queue_task() for managed_node3/package 7530 1727096056.37846: worker is 1 (out of 1 available) 7530 1727096056.37860: exiting _queue_task() for managed_node3/package 7530 1727096056.38077: done queuing things up, now waiting for results queue to drain 7530 1727096056.38079: waiting for pending results... 
7530 1727096056.38198: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7530 1727096056.38371: in run() - task 0afff68d-5257-086b-f4f0-000000000113 7530 1727096056.38392: variable 'ansible_search_path' from source: unknown 7530 1727096056.38401: variable 'ansible_search_path' from source: unknown 7530 1727096056.38450: calling self._execute() 7530 1727096056.38571: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.38585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.38603: variable 'omit' from source: magic vars 7530 1727096056.38997: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.39015: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.39239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096056.39582: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096056.39653: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096056.39755: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096056.39985: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096056.40474: variable 'network_packages' from source: role '' defaults 7530 1727096056.40537: variable '__network_provider_setup' from source: role '' defaults 7530 1727096056.40598: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096056.40783: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096056.40873: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096056.40877: variable '__network_packages_default_nm' from source: role 
'' defaults 7530 1727096056.41487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096056.44060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096056.44132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096056.44193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096056.44265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096056.44331: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096056.44514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.44551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.44581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.44629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.44651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.44711: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.44748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.44779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.44821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.44849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.45093: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7530 1727096056.45218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.45252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.45288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.45337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.45360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.45460: variable 'ansible_python' from source: facts 7530 1727096056.45498: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7530 1727096056.45591: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096056.45681: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096056.45833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.45864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.45894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.45942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.46026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.46029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.46046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.46079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.46122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.46151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.46311: variable 'network_connections' from source: task vars 7530 1727096056.46321: variable 'interface' from source: play vars 7530 1727096056.46423: variable 'interface' from source: play vars 7530 1727096056.46514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096056.46550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096056.46674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.46677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096056.46679: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096056.46969: variable 'network_connections' from source: task vars 7530 1727096056.46980: variable 'interface' from source: play vars 7530 1727096056.47085: variable 'interface' from source: play vars 7530 1727096056.47127: variable '__network_packages_default_wireless' from source: role '' defaults 7530 1727096056.47224: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096056.47562: variable 'network_connections' from source: task vars 7530 1727096056.47579: variable 'interface' from source: play vars 7530 1727096056.47665: variable 'interface' from source: play vars 7530 1727096056.47731: variable '__network_packages_default_team' from source: role '' defaults 7530 1727096056.47998: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096056.48358: variable 'network_connections' from source: task vars 7530 1727096056.48374: variable 'interface' from source: play vars 7530 1727096056.48452: variable 'interface' from source: play vars 7530 1727096056.48515: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096056.48589: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096056.48601: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096056.48669: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096056.48897: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7530 1727096056.49540: variable 'network_connections' from source: task vars 7530 1727096056.49573: variable 'interface' from source: play vars 7530 1727096056.49648: variable 'interface' from source: play vars 7530 
1727096056.49661: variable 'ansible_distribution' from source: facts 7530 1727096056.49672: variable '__network_rh_distros' from source: role '' defaults 7530 1727096056.49684: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.49704: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7530 1727096056.49894: variable 'ansible_distribution' from source: facts 7530 1727096056.49904: variable '__network_rh_distros' from source: role '' defaults 7530 1727096056.49952: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.49955: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7530 1727096056.50109: variable 'ansible_distribution' from source: facts 7530 1727096056.50118: variable '__network_rh_distros' from source: role '' defaults 7530 1727096056.50128: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.50178: variable 'network_provider' from source: set_fact 7530 1727096056.50200: variable 'ansible_facts' from source: unknown 7530 1727096056.50962: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7530 1727096056.51032: when evaluation is False, skipping this task 7530 1727096056.51038: _execute() done 7530 1727096056.51040: dumping result to json 7530 1727096056.51043: done dumping result, returning 7530 1727096056.51045: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-086b-f4f0-000000000113] 7530 1727096056.51047: sending task result for task 0afff68d-5257-086b-f4f0-000000000113 7530 1727096056.51125: done sending task result for task 0afff68d-5257-086b-f4f0-000000000113 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7530 1727096056.51192: no more 
pending results, returning what we have 7530 1727096056.51196: results queue empty 7530 1727096056.51197: checking for any_errors_fatal 7530 1727096056.51205: done checking for any_errors_fatal 7530 1727096056.51206: checking for max_fail_percentage 7530 1727096056.51208: done checking for max_fail_percentage 7530 1727096056.51209: checking to see if all hosts have failed and the running result is not ok 7530 1727096056.51210: done checking to see if all hosts have failed 7530 1727096056.51211: getting the remaining hosts for this loop 7530 1727096056.51212: done getting the remaining hosts for this loop 7530 1727096056.51221: getting the next task for host managed_node3 7530 1727096056.51229: done getting next task for host managed_node3 7530 1727096056.51234: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7530 1727096056.51239: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096056.51263: getting variables 7530 1727096056.51265: in VariableManager get_vars() 7530 1727096056.51321: Calling all_inventory to load vars for managed_node3 7530 1727096056.51324: Calling groups_inventory to load vars for managed_node3 7530 1727096056.51327: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096056.51341: Calling all_plugins_play to load vars for managed_node3 7530 1727096056.51345: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096056.51348: Calling groups_plugins_play to load vars for managed_node3 7530 1727096056.52123: WORKER PROCESS EXITING 7530 1727096056.53413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.55059: done with get_vars() 7530 1727096056.55098: done getting variables 7530 1727096056.55162: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:54:16 -0400 (0:00:00.177) 0:00:47.340 ****** 7530 1727096056.55198: entering _queue_task() for managed_node3/package 7530 1727096056.55543: worker is 1 (out of 1 available) 7530 1727096056.55555: exiting _queue_task() for managed_node3/package 7530 1727096056.55770: done queuing things up, now waiting for results queue to drain 7530 1727096056.55773: waiting for pending results... 
7530 1727096056.55878: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7530 1727096056.56039: in run() - task 0afff68d-5257-086b-f4f0-000000000114 7530 1727096056.56061: variable 'ansible_search_path' from source: unknown 7530 1727096056.56072: variable 'ansible_search_path' from source: unknown 7530 1727096056.56122: calling self._execute() 7530 1727096056.56245: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.56258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.56276: variable 'omit' from source: magic vars 7530 1727096056.56694: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.56713: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.56847: variable 'network_state' from source: role '' defaults 7530 1727096056.56869: Evaluated conditional (network_state != {}): False 7530 1727096056.56881: when evaluation is False, skipping this task 7530 1727096056.56890: _execute() done 7530 1727096056.56898: dumping result to json 7530 1727096056.56905: done dumping result, returning 7530 1727096056.56918: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-086b-f4f0-000000000114] 7530 1727096056.56929: sending task result for task 0afff68d-5257-086b-f4f0-000000000114 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096056.57221: no more pending results, returning what we have 7530 1727096056.57225: results queue empty 7530 1727096056.57227: checking for any_errors_fatal 7530 1727096056.57238: done checking for any_errors_fatal 7530 1727096056.57239: checking for max_fail_percentage 7530 1727096056.57241: done 
checking for max_fail_percentage 7530 1727096056.57242: checking to see if all hosts have failed and the running result is not ok 7530 1727096056.57243: done checking to see if all hosts have failed 7530 1727096056.57244: getting the remaining hosts for this loop 7530 1727096056.57246: done getting the remaining hosts for this loop 7530 1727096056.57250: getting the next task for host managed_node3 7530 1727096056.57258: done getting next task for host managed_node3 7530 1727096056.57262: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7530 1727096056.57266: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096056.57291: getting variables 7530 1727096056.57294: in VariableManager get_vars() 7530 1727096056.57352: Calling all_inventory to load vars for managed_node3 7530 1727096056.57355: Calling groups_inventory to load vars for managed_node3 7530 1727096056.57358: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096056.57579: Calling all_plugins_play to load vars for managed_node3 7530 1727096056.57584: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096056.57589: Calling groups_plugins_play to load vars for managed_node3 7530 1727096056.58283: done sending task result for task 0afff68d-5257-086b-f4f0-000000000114 7530 1727096056.58288: WORKER PROCESS EXITING 7530 1727096056.59192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.60807: done with get_vars() 7530 1727096056.60842: done getting variables 7530 1727096056.60906: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:54:16 -0400 (0:00:00.057) 0:00:47.397 ****** 7530 1727096056.60943: entering _queue_task() for managed_node3/package 7530 1727096056.61309: worker is 1 (out of 1 available) 7530 1727096056.61321: exiting _queue_task() for managed_node3/package 7530 1727096056.61333: done queuing things up, now waiting for results queue to drain 7530 1727096056.61337: waiting for pending results... 
7530 1727096056.61658: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7530 1727096056.61830: in run() - task 0afff68d-5257-086b-f4f0-000000000115 7530 1727096056.61857: variable 'ansible_search_path' from source: unknown 7530 1727096056.61867: variable 'ansible_search_path' from source: unknown 7530 1727096056.61918: calling self._execute() 7530 1727096056.62038: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.62051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.62064: variable 'omit' from source: magic vars 7530 1727096056.62499: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.62520: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.62672: variable 'network_state' from source: role '' defaults 7530 1727096056.62689: Evaluated conditional (network_state != {}): False 7530 1727096056.62698: when evaluation is False, skipping this task 7530 1727096056.62705: _execute() done 7530 1727096056.62712: dumping result to json 7530 1727096056.62719: done dumping result, returning 7530 1727096056.62732: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-086b-f4f0-000000000115] 7530 1727096056.62745: sending task result for task 0afff68d-5257-086b-f4f0-000000000115 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096056.62933: no more pending results, returning what we have 7530 1727096056.62941: results queue empty 7530 1727096056.62942: checking for any_errors_fatal 7530 1727096056.62952: done checking for any_errors_fatal 7530 1727096056.62953: checking for max_fail_percentage 7530 1727096056.62956: done checking for 
max_fail_percentage 7530 1727096056.62957: checking to see if all hosts have failed and the running result is not ok 7530 1727096056.62959: done checking to see if all hosts have failed 7530 1727096056.62959: getting the remaining hosts for this loop 7530 1727096056.62961: done getting the remaining hosts for this loop 7530 1727096056.62966: getting the next task for host managed_node3 7530 1727096056.62975: done getting next task for host managed_node3 7530 1727096056.62981: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096056.62985: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096056.63011: getting variables 7530 1727096056.63013: in VariableManager get_vars() 7530 1727096056.63374: Calling all_inventory to load vars for managed_node3 7530 1727096056.63378: Calling groups_inventory to load vars for managed_node3 7530 1727096056.63381: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096056.63392: Calling all_plugins_play to load vars for managed_node3 7530 1727096056.63396: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096056.63399: Calling groups_plugins_play to load vars for managed_node3 7530 1727096056.64083: done sending task result for task 0afff68d-5257-086b-f4f0-000000000115 7530 1727096056.64088: WORKER PROCESS EXITING 7530 1727096056.64847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.66466: done with get_vars() 7530 1727096056.66501: done getting variables 7530 1727096056.66566: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:54:16 -0400 (0:00:00.056) 0:00:47.454 ****** 7530 1727096056.66604: entering _queue_task() for managed_node3/service 7530 1727096056.66968: worker is 1 (out of 1 available) 7530 1727096056.66982: exiting _queue_task() for managed_node3/service 7530 1727096056.66993: done queuing things up, now waiting for results queue to drain 7530 1727096056.66995: waiting for pending results... 
7530 1727096056.67309: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7530 1727096056.67484: in run() - task 0afff68d-5257-086b-f4f0-000000000116 7530 1727096056.67507: variable 'ansible_search_path' from source: unknown 7530 1727096056.67516: variable 'ansible_search_path' from source: unknown 7530 1727096056.67564: calling self._execute() 7530 1727096056.67688: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.67700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.67716: variable 'omit' from source: magic vars 7530 1727096056.68115: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.68137: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.68275: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096056.68491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096056.71170: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096056.71255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096056.71314: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096056.71362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096056.71396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096056.71488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 
1727096056.71524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.71562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.71608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.71627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.71688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.71715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.71747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.71795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.71815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7530 1727096056.71865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.71972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.71976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.71978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.71980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.72161: variable 'network_connections' from source: task vars 7530 1727096056.72181: variable 'interface' from source: play vars 7530 1727096056.72264: variable 'interface' from source: play vars 7530 1727096056.72355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096056.72548: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096056.72596: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096056.72633: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096056.72686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096056.72739: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096056.72973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096056.72976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.72978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096056.72981: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096056.73157: variable 'network_connections' from source: task vars 7530 1727096056.73171: variable 'interface' from source: play vars 7530 1727096056.73248: variable 'interface' from source: play vars 7530 1727096056.73287: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7530 1727096056.73296: when evaluation is False, skipping this task 7530 1727096056.73303: _execute() done 7530 1727096056.73315: dumping result to json 7530 1727096056.73322: done dumping result, returning 7530 1727096056.73337: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-086b-f4f0-000000000116] 7530 1727096056.73349: sending task result for task 0afff68d-5257-086b-f4f0-000000000116 7530 1727096056.73676: done sending task result for task 0afff68d-5257-086b-f4f0-000000000116 7530 1727096056.73686: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7530 1727096056.73731: no more pending results, returning what we have 7530 1727096056.73737: results queue empty 7530 1727096056.73738: checking for any_errors_fatal 7530 1727096056.73743: done checking for any_errors_fatal 7530 1727096056.73743: checking for max_fail_percentage 7530 1727096056.73745: done checking for max_fail_percentage 7530 1727096056.73746: checking to see if all hosts have failed and the running result is not ok 7530 1727096056.73747: done checking to see if all hosts have failed 7530 1727096056.73747: getting the remaining hosts for this loop 7530 1727096056.73749: done getting the remaining hosts for this loop 7530 1727096056.73752: getting the next task for host managed_node3 7530 1727096056.73758: done getting next task for host managed_node3 7530 1727096056.73762: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096056.73765: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096056.73790: getting variables 7530 1727096056.73793: in VariableManager get_vars() 7530 1727096056.73855: Calling all_inventory to load vars for managed_node3 7530 1727096056.73858: Calling groups_inventory to load vars for managed_node3 7530 1727096056.73861: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096056.73875: Calling all_plugins_play to load vars for managed_node3 7530 1727096056.73879: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096056.73884: Calling groups_plugins_play to load vars for managed_node3 7530 1727096056.75655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096056.77242: done with get_vars() 7530 1727096056.77277: done getting variables 7530 1727096056.77341: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:54:16 -0400 (0:00:00.107) 0:00:47.562 ****** 7530 1727096056.77380: entering _queue_task() for managed_node3/service 7530 1727096056.77748: worker is 1 (out of 1 available) 7530 1727096056.77760: exiting _queue_task() for managed_node3/service 7530 1727096056.77976: done queuing things up, now waiting for results queue to drain 7530 1727096056.77979: waiting for pending results... 
7530 1727096056.78112: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7530 1727096056.78276: in run() - task 0afff68d-5257-086b-f4f0-000000000117 7530 1727096056.78297: variable 'ansible_search_path' from source: unknown 7530 1727096056.78306: variable 'ansible_search_path' from source: unknown 7530 1727096056.78355: calling self._execute() 7530 1727096056.78474: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.78487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.78501: variable 'omit' from source: magic vars 7530 1727096056.78900: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.78919: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096056.79099: variable 'network_provider' from source: set_fact 7530 1727096056.79110: variable 'network_state' from source: role '' defaults 7530 1727096056.79183: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7530 1727096056.79186: variable 'omit' from source: magic vars 7530 1727096056.79211: variable 'omit' from source: magic vars 7530 1727096056.79249: variable 'network_service_name' from source: role '' defaults 7530 1727096056.79322: variable 'network_service_name' from source: role '' defaults 7530 1727096056.79440: variable '__network_provider_setup' from source: role '' defaults 7530 1727096056.79453: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096056.79524: variable '__network_service_name_default_nm' from source: role '' defaults 7530 1727096056.79543: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096056.79618: variable '__network_packages_default_nm' from source: role '' defaults 7530 1727096056.79851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 
1727096056.82143: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096056.82274: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096056.82288: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096056.82327: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096056.82361: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096056.82474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.82491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.82519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.82576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.82657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.82660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7530 1727096056.82683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.82709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.82754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.82781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.83049: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7530 1727096056.83198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.83230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.83263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.83317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.83418: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.83461: variable 'ansible_python' from source: facts 7530 1727096056.83493: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7530 1727096056.83597: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096056.83692: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096056.83838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.83879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.83975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.83978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.83980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.84022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096056.84062: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096056.84099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.84144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096056.84162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096056.84325: variable 'network_connections' from source: task vars 7530 1727096056.84340: variable 'interface' from source: play vars 7530 1727096056.84428: variable 'interface' from source: play vars 7530 1727096056.84548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096056.84845: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096056.84853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096056.84903: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096056.84956: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096056.85030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096056.85075: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096056.85111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096056.85172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096056.85213: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096056.85545: variable 'network_connections' from source: task vars 7530 1727096056.85673: variable 'interface' from source: play vars 7530 1727096056.85676: variable 'interface' from source: play vars 7530 1727096056.85683: variable '__network_packages_default_wireless' from source: role '' defaults 7530 1727096056.85758: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096056.86080: variable 'network_connections' from source: task vars 7530 1727096056.86092: variable 'interface' from source: play vars 7530 1727096056.86180: variable 'interface' from source: play vars 7530 1727096056.86212: variable '__network_packages_default_team' from source: role '' defaults 7530 1727096056.86306: variable '__network_team_connections_defined' from source: role '' defaults 7530 1727096056.86653: variable 'network_connections' from source: task vars 7530 1727096056.86677: variable 'interface' from source: play vars 7530 1727096056.86758: variable 'interface' from source: play vars 7530 1727096056.86874: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096056.86900: variable '__network_service_name_default_initscripts' from source: role '' defaults 7530 1727096056.86912: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7530 1727096056.86980: variable '__network_packages_default_initscripts' from source: role '' defaults 7530 1727096056.87220: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7530 1727096056.88175: variable 'network_connections' from source: task vars 7530 1727096056.88179: variable 'interface' from source: play vars 7530 1727096056.88181: variable 'interface' from source: play vars 7530 1727096056.88183: variable 'ansible_distribution' from source: facts 7530 1727096056.88185: variable '__network_rh_distros' from source: role '' defaults 7530 1727096056.88193: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.88475: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7530 1727096056.88745: variable 'ansible_distribution' from source: facts 7530 1727096056.88757: variable '__network_rh_distros' from source: role '' defaults 7530 1727096056.88771: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.88788: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7530 1727096056.89206: variable 'ansible_distribution' from source: facts 7530 1727096056.89237: variable '__network_rh_distros' from source: role '' defaults 7530 1727096056.89248: variable 'ansible_distribution_major_version' from source: facts 7530 1727096056.89291: variable 'network_provider' from source: set_fact 7530 1727096056.89318: variable 'omit' from source: magic vars 7530 1727096056.89354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096056.89391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096056.89416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096056.89438: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096056.89451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096056.89483: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096056.89490: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.89496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.89593: Set connection var ansible_pipelining to False 7530 1727096056.89606: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096056.89616: Set connection var ansible_timeout to 10 7530 1727096056.89628: Set connection var ansible_shell_executable to /bin/sh 7530 1727096056.89632: Set connection var ansible_shell_type to sh 7530 1727096056.89772: Set connection var ansible_connection to ssh 7530 1727096056.89776: variable 'ansible_shell_executable' from source: unknown 7530 1727096056.89778: variable 'ansible_connection' from source: unknown 7530 1727096056.89780: variable 'ansible_module_compression' from source: unknown 7530 1727096056.89782: variable 'ansible_shell_type' from source: unknown 7530 1727096056.89784: variable 'ansible_shell_executable' from source: unknown 7530 1727096056.89786: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096056.89788: variable 'ansible_pipelining' from source: unknown 7530 1727096056.89790: variable 'ansible_timeout' from source: unknown 7530 1727096056.89791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096056.89806: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096056.89824: variable 'omit' from source: magic vars 7530 1727096056.89846: starting attempt loop 7530 1727096056.89854: running the handler 7530 1727096056.89949: variable 'ansible_facts' from source: unknown 7530 1727096056.90722: _low_level_execute_command(): starting 7530 1727096056.90740: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096056.91939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096056.91960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096056.92286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096056.92304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096056.92358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096056.92600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 7530 1727096056.94322: stdout chunk (state=3): >>>/root <<< 7530 1727096056.94455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096056.94472: stdout chunk (state=3): >>><<< 7530 1727096056.94487: stderr chunk (state=3): >>><<< 7530 1727096056.94513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096056.94770: _low_level_execute_command(): starting 7530 1727096056.94776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158 `" && echo ansible-tmp-1727096056.9467564-9269-6823552581158="` echo /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158 `" ) && sleep 0' 7530 1727096056.95888: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096056.95893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096056.95962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096056.95966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096056.96024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096056.96285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096056.96350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096056.98402: stdout chunk (state=3): >>>ansible-tmp-1727096056.9467564-9269-6823552581158=/root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158 <<< 7530 1727096056.98506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096056.98545: stderr chunk (state=3): >>><<< 7530 1727096056.98555: stdout chunk (state=3): >>><<< 7530 1727096056.98676: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096056.9467564-9269-6823552581158=/root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096056.98723: variable 'ansible_module_compression' from source: unknown 7530 1727096056.98786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7530 1727096056.99034: variable 'ansible_facts' from source: unknown 7530 1727096056.99587: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py 7530 1727096056.99897: Sending initial data 7530 1727096056.99901: Sent initial data (152 bytes) 7530 1727096057.01166: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.01175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.01201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096057.01207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096057.01214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.01330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.01334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.01503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.01510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.03176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096057.03301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096057.03327: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpex46w4pp /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py <<< 7530 1727096057.03331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py" <<< 7530 1727096057.03388: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpex46w4pp" to remote "/root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py" <<< 7530 1727096057.05580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096057.05586: stdout chunk (state=3): >>><<< 7530 1727096057.05589: stderr chunk (state=3): >>><<< 7530 1727096057.05591: done transferring module to remote 7530 1727096057.05593: _low_level_execute_command(): starting 7530 1727096057.05595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/ /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py && sleep 0' 7530 1727096057.07213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.07248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.07251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.07253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.07384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.09285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096057.09311: stderr chunk (state=3): >>><<< 7530 1727096057.09587: stdout chunk (state=3): >>><<< 7530 1727096057.09594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096057.09600: _low_level_execute_command(): starting 7530 1727096057.09603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/AnsiballZ_systemd.py && sleep 0' 7530 1727096057.10521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.10539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096057.10591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 
1727096057.10820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.10823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.10922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.10925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.41829: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; 
pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9551872", "MemoryPeak": "10080256", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3332079616", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "205813000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", 
"StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7530 1727096057.41845: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": 
"no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7530 1727096057.44059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096057.44088: stdout chunk (state=3): >>><<< 7530 1727096057.44092: stderr chunk (state=3): >>><<< 7530 1727096057.44138: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Mon 2024-09-23 08:51:17 EDT] ; stop_time=[n/a] ; pid=705 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9551872", "MemoryPeak": "10080256", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3332079616", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "205813000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", 
"MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service network.target multi-user.target shutdown.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus-broker.service system.slice dbus.socket cloud-init-local.service network-pre.target basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:51:18 EDT", "StateChangeTimestampMonotonic": "22578647", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": 
"Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
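The module result above was produced by `ansible.legacy.systemd` with the arguments shown in the `invocation.module_args` block (`name: NetworkManager`, `state: started`, `enabled: true`, `scope: system`). The role itself calls the generic `service` action (see the "Loading ActionModule 'service'" lines later in this log), which resolved to the systemd module on this systemd-managed host. A minimal standalone task equivalent to that invocation — a sketch, not the role's actual task file, which is not shown in this log — would look like:

```yaml
# Standalone equivalent of the logged module invocation.
# On a systemd host, ansible.builtin.service delegates to this module,
# so the role's "Enable and start NetworkManager" task behaves the same.
- name: Enable and start NetworkManager
  ansible.builtin.systemd_service:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
```

Because the unit was already `active`/`enabled`, the module reported `"changed": false`; the large JSON payload is simply the full `systemctl show`-style property dump that the systemd module returns in `status`.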
7530 1727096057.44383: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096057.44394: _low_level_execute_command(): starting 7530 1727096057.44404: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096056.9467564-9269-6823552581158/ > /dev/null 2>&1 && sleep 0' 7530 1727096057.45278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096057.45293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096057.45305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.45320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096057.45366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.45429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.45453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.45549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.45666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.47545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096057.47562: stdout chunk (state=3): >>><<< 7530 1727096057.47577: stderr chunk (state=3): >>><<< 7530 1727096057.47595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096057.47774: handler run complete 7530 1727096057.47777: attempt loop complete, returning result 7530 1727096057.47780: _execute() done 7530 1727096057.47782: dumping result to json 7530 1727096057.47784: done dumping result, returning 7530 1727096057.47785: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-086b-f4f0-000000000117] 7530 1727096057.47787: sending task result for task 0afff68d-5257-086b-f4f0-000000000117 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096057.48417: no more pending results, returning what we have 7530 1727096057.48420: results queue empty 7530 1727096057.48421: checking for any_errors_fatal 7530 1727096057.48426: done checking for any_errors_fatal 7530 1727096057.48427: checking for max_fail_percentage 7530 1727096057.48429: done checking for max_fail_percentage 7530 1727096057.48430: checking to see if all hosts have failed and the running result is not ok 7530 1727096057.48431: done checking to see if all hosts have failed 7530 1727096057.48432: getting the remaining hosts for this loop 7530 1727096057.48433: done getting the remaining hosts for this loop 7530 1727096057.48470: getting the next task for host managed_node3 7530 1727096057.48480: done getting next task for host managed_node3 7530 1727096057.48484: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096057.48486: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096057.48564: getting variables 7530 1727096057.48566: in VariableManager get_vars() 7530 1727096057.48624: Calling all_inventory to load vars for managed_node3 7530 1727096057.48627: Calling groups_inventory to load vars for managed_node3 7530 1727096057.48629: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096057.48642: Calling all_plugins_play to load vars for managed_node3 7530 1727096057.48645: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096057.48648: Calling groups_plugins_play to load vars for managed_node3 7530 1727096057.49478: done sending task result for task 0afff68d-5257-086b-f4f0-000000000117 7530 1727096057.49482: WORKER PROCESS EXITING 7530 1727096057.51980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096057.54347: done with get_vars() 7530 1727096057.54383: done getting variables 7530 1727096057.54450: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:54:17 -0400 (0:00:00.774) 0:00:48.336 ****** 7530 1727096057.54810: entering _queue_task() for managed_node3/service 
7530 1727096057.55769: worker is 1 (out of 1 available) 7530 1727096057.55783: exiting _queue_task() for managed_node3/service 7530 1727096057.55795: done queuing things up, now waiting for results queue to drain 7530 1727096057.55797: waiting for pending results... 7530 1727096057.56086: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7530 1727096057.56614: in run() - task 0afff68d-5257-086b-f4f0-000000000118 7530 1727096057.56623: variable 'ansible_search_path' from source: unknown 7530 1727096057.56632: variable 'ansible_search_path' from source: unknown 7530 1727096057.56716: calling self._execute() 7530 1727096057.56850: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096057.56863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096057.56882: variable 'omit' from source: magic vars 7530 1727096057.57394: variable 'ansible_distribution_major_version' from source: facts 7530 1727096057.57421: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096057.57685: variable 'network_provider' from source: set_fact 7530 1727096057.57691: Evaluated conditional (network_provider == "nm"): True 7530 1727096057.57821: variable '__network_wpa_supplicant_required' from source: role '' defaults 7530 1727096057.58047: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7530 1727096057.58329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096057.61138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096057.61375: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096057.61379: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
7530 1727096057.61415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096057.61472: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096057.61610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096057.61614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096057.61640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096057.61742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096057.61748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096057.61753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096057.61775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096057.61798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096057.61845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096057.61849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096057.61955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096057.61958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096057.61960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096057.61979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096057.61994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096057.62161: variable 'network_connections' from source: task vars 7530 1727096057.62168: variable 'interface' from source: play vars 7530 1727096057.62241: variable 'interface' from source: play vars 7530 1727096057.62322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due 
to reserved name 7530 1727096057.62505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096057.62543: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096057.62621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096057.62624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096057.62656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7530 1727096057.62677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7530 1727096057.62708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096057.62775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7530 1727096057.62783: variable '__network_wireless_connections_defined' from source: role '' defaults 7530 1727096057.63046: variable 'network_connections' from source: task vars 7530 1727096057.63050: variable 'interface' from source: play vars 7530 1727096057.63174: variable 'interface' from source: play vars 7530 1727096057.63179: Evaluated conditional (__network_wpa_supplicant_required): False 7530 1727096057.63181: when evaluation is False, skipping this task 7530 1727096057.63183: _execute() done 7530 1727096057.63185: dumping result to json 7530 1727096057.63187: done dumping result, returning 7530 
1727096057.63189: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-086b-f4f0-000000000118] 7530 1727096057.63200: sending task result for task 0afff68d-5257-086b-f4f0-000000000118 7530 1727096057.63272: done sending task result for task 0afff68d-5257-086b-f4f0-000000000118 7530 1727096057.63275: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7530 1727096057.63320: no more pending results, returning what we have 7530 1727096057.63323: results queue empty 7530 1727096057.63324: checking for any_errors_fatal 7530 1727096057.63349: done checking for any_errors_fatal 7530 1727096057.63350: checking for max_fail_percentage 7530 1727096057.63352: done checking for max_fail_percentage 7530 1727096057.63353: checking to see if all hosts have failed and the running result is not ok 7530 1727096057.63354: done checking to see if all hosts have failed 7530 1727096057.63355: getting the remaining hosts for this loop 7530 1727096057.63356: done getting the remaining hosts for this loop 7530 1727096057.63361: getting the next task for host managed_node3 7530 1727096057.63371: done getting next task for host managed_node3 7530 1727096057.63377: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7530 1727096057.63381: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7530 1727096057.63412: getting variables 7530 1727096057.63415: in VariableManager get_vars() 7530 1727096057.63616: Calling all_inventory to load vars for managed_node3 7530 1727096057.63619: Calling groups_inventory to load vars for managed_node3 7530 1727096057.63621: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096057.63631: Calling all_plugins_play to load vars for managed_node3 7530 1727096057.63633: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096057.63636: Calling groups_plugins_play to load vars for managed_node3 7530 1727096057.64995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096057.66617: done with get_vars() 7530 1727096057.66653: done getting variables 7530 1727096057.66718: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:54:17 -0400 (0:00:00.119) 0:00:48.455 ****** 7530 1727096057.66755: entering _queue_task() for managed_node3/service 7530 1727096057.67136: worker is 1 (out of 1 available) 7530 1727096057.67149: exiting _queue_task() for managed_node3/service 7530 1727096057.67162: done queuing things up, now waiting for results queue to drain 7530 1727096057.67164: waiting for pending results... 
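The two task records above ("Enable and start wpa_supplicant", and "Enable network service" being queued next) both follow the same skip path: `_execute()` evaluates the task's `when` expression, and a False result short-circuits straight to a skipped JSON payload carrying `false_condition` and `skip_reason`. A minimal sketch of that control flow (the function name `run_task` is illustrative, not Ansible's actual API):

```python
# Illustrative sketch of the skip path seen in the log above.
# `false_condition` records which `when` expression evaluated to False,
# mirroring the fields in the "skipping: [managed_node3]" payload.

def run_task(condition_name: str, condition_value: bool) -> dict:
    """Return the result dict a task produces when its `when` clause fails."""
    if not condition_value:
        # "when evaluation is False, skipping this task" -> dump result to JSON
        return {
            "changed": False,
            "false_condition": condition_name,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

result = run_task("__network_wpa_supplicant_required", False)
print(result["skip_reason"])  # Conditional result was False
```

Note that the worker still sends this skipped result back through the results queue, which is why the log shows "sending task result" and "WORKER PROCESS EXITING" even for skipped tasks.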
7530 1727096057.67817: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7530 1727096057.67826: in run() - task 0afff68d-5257-086b-f4f0-000000000119 7530 1727096057.67859: variable 'ansible_search_path' from source: unknown 7530 1727096057.67871: variable 'ansible_search_path' from source: unknown 7530 1727096057.67926: calling self._execute() 7530 1727096057.68108: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096057.68121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096057.68160: variable 'omit' from source: magic vars 7530 1727096057.68652: variable 'ansible_distribution_major_version' from source: facts 7530 1727096057.68664: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096057.68981: variable 'network_provider' from source: set_fact 7530 1727096057.68985: Evaluated conditional (network_provider == "initscripts"): False 7530 1727096057.68987: when evaluation is False, skipping this task 7530 1727096057.68990: _execute() done 7530 1727096057.68992: dumping result to json 7530 1727096057.68994: done dumping result, returning 7530 1727096057.68997: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-086b-f4f0-000000000119] 7530 1727096057.68999: sending task result for task 0afff68d-5257-086b-f4f0-000000000119 7530 1727096057.69087: done sending task result for task 0afff68d-5257-086b-f4f0-000000000119 7530 1727096057.69091: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7530 1727096057.69145: no more pending results, returning what we have 7530 1727096057.69149: results queue empty 7530 1727096057.69150: checking for any_errors_fatal 7530 1727096057.69160: done checking for any_errors_fatal 7530 
1727096057.69161: checking for max_fail_percentage 7530 1727096057.69163: done checking for max_fail_percentage 7530 1727096057.69164: checking to see if all hosts have failed and the running result is not ok 7530 1727096057.69165: done checking to see if all hosts have failed 7530 1727096057.69166: getting the remaining hosts for this loop 7530 1727096057.69170: done getting the remaining hosts for this loop 7530 1727096057.69174: getting the next task for host managed_node3 7530 1727096057.69182: done getting next task for host managed_node3 7530 1727096057.69187: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7530 1727096057.69191: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096057.69220: getting variables 7530 1727096057.69222: in VariableManager get_vars() 7530 1727096057.69411: Calling all_inventory to load vars for managed_node3 7530 1727096057.69415: Calling groups_inventory to load vars for managed_node3 7530 1727096057.69418: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096057.69431: Calling all_plugins_play to load vars for managed_node3 7530 1727096057.69437: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096057.69441: Calling groups_plugins_play to load vars for managed_node3 7530 1727096057.71550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096057.73806: done with get_vars() 7530 1727096057.73850: done getting variables 7530 1727096057.73926: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:54:17 -0400 (0:00:00.072) 0:00:48.528 ****** 7530 1727096057.73971: entering _queue_task() for managed_node3/copy 7530 1727096057.74455: worker is 1 (out of 1 available) 7530 1727096057.74469: exiting _queue_task() for managed_node3/copy 7530 1727096057.74480: done queuing things up, now waiting for results queue to drain 7530 1727096057.74482: waiting for pending results... 
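The "Enable network service" result earlier was replaced by a `censored` marker because the task sets `no_log: true`: everything except the bare status is stripped before the result is printed. A rough sketch of that censoring step (the helper name `censor_result` is hypothetical; Ansible's internal implementation differs):

```python
# Hypothetical sketch of no_log censoring, matching the message in the log.
CENSORED = (
    "the output has been hidden due to the fact that "
    "'no_log: true' was specified for this result"
)

def censor_result(result: dict, no_log: bool) -> dict:
    """Strip everything but bare status fields when no_log is set."""
    if not no_log:
        return result
    # Keep only 'changed' so callers can still see the task outcome,
    # and explain why the rest of the payload is missing.
    return {"censored": CENSORED, "changed": result.get("changed", False)}

print(censor_result({"changed": False, "cmd": "enable NetworkManager"}, True))
```

This is why a skipped `no_log` task shows `"censored"` in place of the usual `false_condition`/`skip_reason` fields that appear for the uncensored skips elsewhere in this run.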
7530 1727096057.74782: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7530 1727096057.74974: in run() - task 0afff68d-5257-086b-f4f0-00000000011a 7530 1727096057.74987: variable 'ansible_search_path' from source: unknown 7530 1727096057.74990: variable 'ansible_search_path' from source: unknown 7530 1727096057.75002: calling self._execute() 7530 1727096057.75177: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096057.75214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096057.75274: variable 'omit' from source: magic vars 7530 1727096057.75862: variable 'ansible_distribution_major_version' from source: facts 7530 1727096057.75872: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096057.75978: variable 'network_provider' from source: set_fact 7530 1727096057.75987: Evaluated conditional (network_provider == "initscripts"): False 7530 1727096057.75991: when evaluation is False, skipping this task 7530 1727096057.75994: _execute() done 7530 1727096057.75997: dumping result to json 7530 1727096057.75999: done dumping result, returning 7530 1727096057.76008: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-086b-f4f0-00000000011a] 7530 1727096057.76013: sending task result for task 0afff68d-5257-086b-f4f0-00000000011a skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7530 1727096057.76199: no more pending results, returning what we have 7530 1727096057.76205: results queue empty 7530 1727096057.76206: checking for any_errors_fatal 7530 1727096057.76214: done checking for any_errors_fatal 7530 1727096057.76214: checking for max_fail_percentage 7530 1727096057.76216: done 
checking for max_fail_percentage 7530 1727096057.76217: checking to see if all hosts have failed and the running result is not ok 7530 1727096057.76218: done checking to see if all hosts have failed 7530 1727096057.76219: getting the remaining hosts for this loop 7530 1727096057.76220: done getting the remaining hosts for this loop 7530 1727096057.76224: getting the next task for host managed_node3 7530 1727096057.76230: done getting next task for host managed_node3 7530 1727096057.76234: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7530 1727096057.76239: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096057.76260: getting variables 7530 1727096057.76262: in VariableManager get_vars() 7530 1727096057.76322: Calling all_inventory to load vars for managed_node3 7530 1727096057.76327: Calling groups_inventory to load vars for managed_node3 7530 1727096057.76330: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096057.76345: Calling all_plugins_play to load vars for managed_node3 7530 1727096057.76349: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096057.76354: Calling groups_plugins_play to load vars for managed_node3 7530 1727096057.76884: done sending task result for task 0afff68d-5257-086b-f4f0-00000000011a 7530 1727096057.76888: WORKER PROCESS EXITING 7530 1727096057.77216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096057.78592: done with get_vars() 7530 1727096057.78652: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:54:17 -0400 (0:00:00.047) 0:00:48.575 ****** 7530 1727096057.78767: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7530 1727096057.79067: worker is 1 (out of 1 available) 7530 1727096057.79084: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7530 1727096057.79098: done queuing things up, now waiting for results queue to drain 7530 1727096057.79100: waiting for pending results... 
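Each `^ state is: HOST STATE: ...` record above dumps the strategy's per-host cursor: which block and task index the host is on, its run/fail state, and nested child states for the `tasks`/`rescue`/`always` sections of the current block. A simplified model of that bookkeeping (field names follow the log output; this is a sketch, not Ansible's real play-iterator class):

```python
# Simplified model of the HOST STATE records in the log above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    block: int = 0
    task: int = 0
    rescue: int = 0
    always: int = 0
    run_state: int = 1          # 1 appears to mean "iterating tasks" here
    fail_state: int = 0
    pending_setup: bool = False
    tasks_child_state: Optional["HostState"] = None
    did_rescue: bool = False

# The state dumped before "Configure networking connection profiles":
# block=2, task=37, with a tasks child state at block=0, task=20.
state = HostState(block=2, task=37,
                  tasks_child_state=HostState(block=0, task=20))
print(state.tasks_child_state.task)  # 20
```

The child state advancing from task=18 to 19 to 20 across the three records is the iterator stepping through the role's included task list one task at a time, while the outer block=2, task=37 position stays fixed on the include itself.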
7530 1727096057.79288: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7530 1727096057.79388: in run() - task 0afff68d-5257-086b-f4f0-00000000011b 7530 1727096057.79401: variable 'ansible_search_path' from source: unknown 7530 1727096057.79405: variable 'ansible_search_path' from source: unknown 7530 1727096057.79434: calling self._execute() 7530 1727096057.79517: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096057.79522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096057.79530: variable 'omit' from source: magic vars 7530 1727096057.79827: variable 'ansible_distribution_major_version' from source: facts 7530 1727096057.79839: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096057.79843: variable 'omit' from source: magic vars 7530 1727096057.79887: variable 'omit' from source: magic vars 7530 1727096057.80007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096057.82275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096057.82292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096057.82340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096057.82384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096057.82416: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096057.82514: variable 'network_provider' from source: set_fact 7530 1727096057.82636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096057.82658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096057.82677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096057.82705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096057.82736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096057.82784: variable 'omit' from source: magic vars 7530 1727096057.82873: variable 'omit' from source: magic vars 7530 1727096057.82977: variable 'network_connections' from source: task vars 7530 1727096057.82987: variable 'interface' from source: play vars 7530 1727096057.83033: variable 'interface' from source: play vars 7530 1727096057.83177: variable 'omit' from source: magic vars 7530 1727096057.83184: variable '__lsr_ansible_managed' from source: task vars 7530 1727096057.83225: variable '__lsr_ansible_managed' from source: task vars 7530 1727096057.83490: Loaded config def from plugin (lookup/template) 7530 1727096057.83499: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7530 1727096057.83529: File lookup term: get_ansible_managed.j2 7530 1727096057.83535: variable 'ansible_search_path' from source: unknown 7530 1727096057.83546: evaluation_path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7530 1727096057.83560: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7530 1727096057.83585: variable 'ansible_search_path' from source: unknown 7530 1727096057.88734: variable 'ansible_managed' from source: unknown 7530 1727096057.88826: variable 'omit' from source: magic vars 7530 1727096057.88856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096057.88879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096057.88893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096057.88906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096057.88914: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096057.88935: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096057.88945: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096057.88948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096057.89012: Set connection var ansible_pipelining to False 7530 1727096057.89016: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096057.89032: Set connection var ansible_timeout to 10 7530 1727096057.89272: Set connection var ansible_shell_executable to /bin/sh 7530 1727096057.89276: Set connection var ansible_shell_type to sh 7530 1727096057.89278: Set connection var ansible_connection to ssh 7530 1727096057.89280: variable 'ansible_shell_executable' from source: unknown 7530 1727096057.89282: variable 'ansible_connection' from source: unknown 7530 1727096057.89283: variable 'ansible_module_compression' from source: unknown 7530 1727096057.89285: variable 'ansible_shell_type' from source: unknown 7530 1727096057.89287: variable 'ansible_shell_executable' from source: unknown 7530 1727096057.89289: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096057.89293: variable 'ansible_pipelining' from source: unknown 7530 1727096057.89296: variable 'ansible_timeout' from source: unknown 7530 1727096057.89298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096057.89301: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096057.89316: variable 'omit' from source: magic vars 7530 1727096057.89318: starting attempt loop 7530 1727096057.89320: running the handler 7530 
1727096057.89322: _low_level_execute_command(): starting 7530 1727096057.89324: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096057.90098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096057.90122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.90140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.90214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.90229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.90279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.91999: stdout chunk (state=3): >>>/root <<< 7530 1727096057.92095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096057.92141: stderr chunk (state=3): >>><<< 7530 1727096057.92146: stdout chunk (state=3): >>><<< 7530 1727096057.92162: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096057.92175: _low_level_execute_command(): starting 7530 1727096057.92184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058 `" && echo ansible-tmp-1727096057.9216282-9306-116342000982058="` echo /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058 `" ) && sleep 0' 7530 1727096057.92646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.92650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found 
<<< 7530 1727096057.92683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.92686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096057.92689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096057.92691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.92746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.92749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.92751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.92801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.94843: stdout chunk (state=3): >>>ansible-tmp-1727096057.9216282-9306-116342000982058=/root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058 <<< 7530 1727096057.94952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096057.94981: stderr chunk (state=3): >>><<< 7530 1727096057.94986: stdout chunk (state=3): >>><<< 7530 1727096057.95003: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096057.9216282-9306-116342000982058=/root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096057.95049: variable 'ansible_module_compression' from source: unknown 7530 1727096057.95088: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7530 1727096057.95132: variable 'ansible_facts' from source: unknown 7530 1727096057.95226: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py 7530 1727096057.95332: Sending initial data 7530 1727096057.95337: Sent initial data (166 bytes) 7530 1727096057.95822: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096057.95826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096057.95833: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.95835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096057.95837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096057.95839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.95909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.95912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.95922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.95956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096057.97649: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096057.97678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096057.97704: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp15nx332m /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py <<< 7530 1727096057.97719: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py" <<< 7530 1727096057.97748: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp15nx332m" to remote "/root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py" <<< 7530 1727096057.97752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py" <<< 7530 1727096057.98463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096057.98512: stderr chunk (state=3): >>><<< 7530 1727096057.98517: stdout chunk (state=3): >>><<< 7530 1727096057.98566: done transferring module to remote 7530 1727096057.98578: _low_level_execute_command(): starting 7530 1727096057.98582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/ /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py && sleep 0' 7530 1727096057.99065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096057.99071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096057.99074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.99092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096057.99141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096057.99144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096057.99153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096057.99201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.01116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.01148: stderr chunk (state=3): >>><<< 7530 1727096058.01151: stdout chunk (state=3): >>><<< 7530 1727096058.01166: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096058.01171: _low_level_execute_command(): starting 7530 1727096058.01176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/AnsiballZ_network_connections.py && sleep 0' 7530 1727096058.01654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096058.01658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096058.01661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.01679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.01743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096058.01757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096058.01760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.01820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.35376: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_04zl7386/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_04zl7386/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab: error=unknown <<< 7530 1727096058.35750: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7530 1727096058.37675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.37726: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 7530 1727096058.37739: stderr chunk (state=3): >>><<< 7530 1727096058.37747: stdout chunk (state=3): >>><<< 7530 1727096058.37774: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_04zl7386/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_04zl7386/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/bf9fa6e3-ccc0-4627-ac8e-461ec69e8aab: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096058.37906: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096058.37909: _low_level_execute_command(): starting 7530 1727096058.37911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096057.9216282-9306-116342000982058/ > /dev/null 2>&1 && sleep 0' 7530 1727096058.38502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096058.38550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096058.38566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096058.38650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.38704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096058.38721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096058.38761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.38843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.40763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.40769: stdout chunk (state=3): >>><<< 7530 1727096058.40974: stderr chunk (state=3): >>><<< 7530 1727096058.40978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096058.40987: handler run complete 7530 1727096058.40989: attempt loop complete, returning result 7530 1727096058.40991: _execute() done 7530 1727096058.40992: dumping result to json 7530 1727096058.40994: done dumping result, returning 7530 1727096058.40996: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-086b-f4f0-00000000011b] 7530 1727096058.40997: sending task result for task 0afff68d-5257-086b-f4f0-00000000011b 7530 1727096058.41074: done sending task result for task 0afff68d-5257-086b-f4f0-00000000011b 7530 1727096058.41077: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7530 1727096058.41206: no more pending results, returning what we have 7530 1727096058.41210: results queue empty 7530 1727096058.41211: checking for any_errors_fatal 7530 1727096058.41217: done checking for any_errors_fatal 7530 1727096058.41218: checking for max_fail_percentage 7530 1727096058.41219: done checking for max_fail_percentage 7530 1727096058.41220: checking to see if all hosts have failed and the running result is not ok 7530 1727096058.41221: done checking to see if all hosts have failed 7530 1727096058.41222: getting the remaining hosts for this loop 7530 1727096058.41223: done getting the remaining hosts for this loop 7530 1727096058.41227: getting the next task for host managed_node3 7530 1727096058.41232: done getting next 
task for host managed_node3 7530 1727096058.41236: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096058.41239: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096058.41251: getting variables 7530 1727096058.41253: in VariableManager get_vars() 7530 1727096058.41412: Calling all_inventory to load vars for managed_node3 7530 1727096058.41415: Calling groups_inventory to load vars for managed_node3 7530 1727096058.41417: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096058.41428: Calling all_plugins_play to load vars for managed_node3 7530 1727096058.41571: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096058.41577: Calling groups_plugins_play to load vars for managed_node3 7530 1727096058.43136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096058.44758: done with get_vars() 7530 1727096058.44804: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:54:18 -0400 (0:00:00.661) 0:00:49.237 ****** 7530 1727096058.44907: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096058.45332: worker is 1 (out of 1 available) 7530 1727096058.45345: exiting 
_queue_task() for managed_node3/fedora.linux_system_roles.network_state 7530 1727096058.45356: done queuing things up, now waiting for results queue to drain 7530 1727096058.45357: waiting for pending results... 7530 1727096058.45613: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7530 1727096058.45768: in run() - task 0afff68d-5257-086b-f4f0-00000000011c 7530 1727096058.45792: variable 'ansible_search_path' from source: unknown 7530 1727096058.45800: variable 'ansible_search_path' from source: unknown 7530 1727096058.45844: calling self._execute() 7530 1727096058.45963: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.45985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.46000: variable 'omit' from source: magic vars 7530 1727096058.46406: variable 'ansible_distribution_major_version' from source: facts 7530 1727096058.46430: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096058.46562: variable 'network_state' from source: role '' defaults 7530 1727096058.46630: Evaluated conditional (network_state != {}): False 7530 1727096058.46637: when evaluation is False, skipping this task 7530 1727096058.46639: _execute() done 7530 1727096058.46642: dumping result to json 7530 1727096058.46644: done dumping result, returning 7530 1727096058.46646: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-086b-f4f0-00000000011c] 7530 1727096058.46648: sending task result for task 0afff68d-5257-086b-f4f0-00000000011c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7530 1727096058.46806: no more pending results, returning what we have 7530 1727096058.46811: results queue empty 7530 1727096058.46813: checking for any_errors_fatal 7530 
1727096058.46828: done checking for any_errors_fatal 7530 1727096058.46829: checking for max_fail_percentage 7530 1727096058.46832: done checking for max_fail_percentage 7530 1727096058.46833: checking to see if all hosts have failed and the running result is not ok 7530 1727096058.46834: done checking to see if all hosts have failed 7530 1727096058.46835: getting the remaining hosts for this loop 7530 1727096058.46836: done getting the remaining hosts for this loop 7530 1727096058.46840: getting the next task for host managed_node3 7530 1727096058.46851: done getting next task for host managed_node3 7530 1727096058.46856: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096058.46860: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096058.46889: getting variables 7530 1727096058.46891: in VariableManager get_vars() 7530 1727096058.46946: Calling all_inventory to load vars for managed_node3 7530 1727096058.46949: Calling groups_inventory to load vars for managed_node3 7530 1727096058.46952: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096058.46965: Calling all_plugins_play to load vars for managed_node3 7530 1727096058.47186: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096058.47191: Calling groups_plugins_play to load vars for managed_node3 7530 1727096058.47801: done sending task result for task 0afff68d-5257-086b-f4f0-00000000011c 7530 1727096058.47805: WORKER PROCESS EXITING 7530 1727096058.48679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096058.50482: done with get_vars() 7530 1727096058.50505: done getting variables 7530 1727096058.50573: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:54:18 -0400 (0:00:00.056) 0:00:49.294 ****** 7530 1727096058.50608: entering _queue_task() for managed_node3/debug 7530 1727096058.51016: worker is 1 (out of 1 available) 7530 1727096058.51029: exiting _queue_task() for managed_node3/debug 7530 1727096058.51039: done queuing things up, now waiting for results queue to drain 7530 1727096058.51041: waiting for pending results... 
7530 1727096058.51299: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7530 1727096058.51452: in run() - task 0afff68d-5257-086b-f4f0-00000000011d 7530 1727096058.51476: variable 'ansible_search_path' from source: unknown 7530 1727096058.51483: variable 'ansible_search_path' from source: unknown 7530 1727096058.51527: calling self._execute() 7530 1727096058.51639: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.51655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.51672: variable 'omit' from source: magic vars 7530 1727096058.52076: variable 'ansible_distribution_major_version' from source: facts 7530 1727096058.52173: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096058.52177: variable 'omit' from source: magic vars 7530 1727096058.52180: variable 'omit' from source: magic vars 7530 1727096058.52274: variable 'omit' from source: magic vars 7530 1727096058.52277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096058.52280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096058.52317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096058.52340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096058.52358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096058.52406: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096058.52416: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.52425: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7530 1727096058.52543: Set connection var ansible_pipelining to False 7530 1727096058.52557: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096058.52571: Set connection var ansible_timeout to 10 7530 1727096058.52586: Set connection var ansible_shell_executable to /bin/sh 7530 1727096058.52594: Set connection var ansible_shell_type to sh 7530 1727096058.52600: Set connection var ansible_connection to ssh 7530 1727096058.52638: variable 'ansible_shell_executable' from source: unknown 7530 1727096058.52732: variable 'ansible_connection' from source: unknown 7530 1727096058.52736: variable 'ansible_module_compression' from source: unknown 7530 1727096058.52738: variable 'ansible_shell_type' from source: unknown 7530 1727096058.52741: variable 'ansible_shell_executable' from source: unknown 7530 1727096058.52743: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.52745: variable 'ansible_pipelining' from source: unknown 7530 1727096058.52747: variable 'ansible_timeout' from source: unknown 7530 1727096058.52748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.52846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096058.52866: variable 'omit' from source: magic vars 7530 1727096058.52879: starting attempt loop 7530 1727096058.52887: running the handler 7530 1727096058.53038: variable '__network_connections_result' from source: set_fact 7530 1727096058.53105: handler run complete 7530 1727096058.53131: attempt loop complete, returning result 7530 1727096058.53138: _execute() done 7530 1727096058.53146: dumping result to json 7530 1727096058.53153: done dumping result, returning 7530 
1727096058.53175: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-086b-f4f0-00000000011d] 7530 1727096058.53186: sending task result for task 0afff68d-5257-086b-f4f0-00000000011d 7530 1727096058.53347: done sending task result for task 0afff68d-5257-086b-f4f0-00000000011d 7530 1727096058.53352: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7530 1727096058.53452: no more pending results, returning what we have 7530 1727096058.53456: results queue empty 7530 1727096058.53457: checking for any_errors_fatal 7530 1727096058.53465: done checking for any_errors_fatal 7530 1727096058.53465: checking for max_fail_percentage 7530 1727096058.53469: done checking for max_fail_percentage 7530 1727096058.53470: checking to see if all hosts have failed and the running result is not ok 7530 1727096058.53472: done checking to see if all hosts have failed 7530 1727096058.53472: getting the remaining hosts for this loop 7530 1727096058.53474: done getting the remaining hosts for this loop 7530 1727096058.53478: getting the next task for host managed_node3 7530 1727096058.53600: done getting next task for host managed_node3 7530 1727096058.53606: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096058.53609: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7530 1727096058.53625: getting variables 7530 1727096058.53627: in VariableManager get_vars() 7530 1727096058.53722: Calling all_inventory to load vars for managed_node3 7530 1727096058.53726: Calling groups_inventory to load vars for managed_node3 7530 1727096058.53728: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096058.53739: Calling all_plugins_play to load vars for managed_node3 7530 1727096058.53743: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096058.53746: Calling groups_plugins_play to load vars for managed_node3 7530 1727096058.55264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096058.56831: done with get_vars() 7530 1727096058.56861: done getting variables 7530 1727096058.56933: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:54:18 -0400 (0:00:00.063) 0:00:49.358 ****** 7530 1727096058.56972: entering _queue_task() for managed_node3/debug 7530 1727096058.57578: worker is 1 (out of 1 available) 7530 1727096058.57587: exiting _queue_task() for managed_node3/debug 7530 1727096058.57597: done queuing things up, now waiting for results queue to drain 7530 1727096058.57599: waiting for pending results... 
7530 1727096058.57731: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7530 1727096058.57834: in run() - task 0afff68d-5257-086b-f4f0-00000000011e 7530 1727096058.57936: variable 'ansible_search_path' from source: unknown 7530 1727096058.57940: variable 'ansible_search_path' from source: unknown 7530 1727096058.57943: calling self._execute() 7530 1727096058.58019: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.58033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.58056: variable 'omit' from source: magic vars 7530 1727096058.58452: variable 'ansible_distribution_major_version' from source: facts 7530 1727096058.58479: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096058.58493: variable 'omit' from source: magic vars 7530 1727096058.58557: variable 'omit' from source: magic vars 7530 1727096058.58612: variable 'omit' from source: magic vars 7530 1727096058.58660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096058.58709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096058.58801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096058.58805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096058.58808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096058.58810: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096058.58817: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.58825: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096058.58939: Set connection var ansible_pipelining to False 7530 1727096058.58951: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096058.58962: Set connection var ansible_timeout to 10 7530 1727096058.58977: Set connection var ansible_shell_executable to /bin/sh 7530 1727096058.58984: Set connection var ansible_shell_type to sh 7530 1727096058.58991: Set connection var ansible_connection to ssh 7530 1727096058.59030: variable 'ansible_shell_executable' from source: unknown 7530 1727096058.59038: variable 'ansible_connection' from source: unknown 7530 1727096058.59046: variable 'ansible_module_compression' from source: unknown 7530 1727096058.59052: variable 'ansible_shell_type' from source: unknown 7530 1727096058.59126: variable 'ansible_shell_executable' from source: unknown 7530 1727096058.59129: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.59132: variable 'ansible_pipelining' from source: unknown 7530 1727096058.59134: variable 'ansible_timeout' from source: unknown 7530 1727096058.59136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.59239: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096058.59257: variable 'omit' from source: magic vars 7530 1727096058.59267: starting attempt loop 7530 1727096058.59276: running the handler 7530 1727096058.59329: variable '__network_connections_result' from source: set_fact 7530 1727096058.59424: variable '__network_connections_result' from source: set_fact 7530 1727096058.59550: handler run complete 7530 1727096058.59589: attempt loop complete, returning result 7530 1727096058.59597: _execute() done 7530 1727096058.59672: dumping 
result to json 7530 1727096058.59675: done dumping result, returning 7530 1727096058.59678: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-086b-f4f0-00000000011e] 7530 1727096058.59680: sending task result for task 0afff68d-5257-086b-f4f0-00000000011e 7530 1727096058.59755: done sending task result for task 0afff68d-5257-086b-f4f0-00000000011e 7530 1727096058.59758: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7530 1727096058.59865: no more pending results, returning what we have 7530 1727096058.59870: results queue empty 7530 1727096058.59872: checking for any_errors_fatal 7530 1727096058.59885: done checking for any_errors_fatal 7530 1727096058.59886: checking for max_fail_percentage 7530 1727096058.59888: done checking for max_fail_percentage 7530 1727096058.59889: checking to see if all hosts have failed and the running result is not ok 7530 1727096058.59891: done checking to see if all hosts have failed 7530 1727096058.59891: getting the remaining hosts for this loop 7530 1727096058.59893: done getting the remaining hosts for this loop 7530 1727096058.59897: getting the next task for host managed_node3 7530 1727096058.59904: done getting next task for host managed_node3 7530 1727096058.59908: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096058.59912: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096058.59927: getting variables 7530 1727096058.59929: in VariableManager get_vars() 7530 1727096058.60087: Calling all_inventory to load vars for managed_node3 7530 1727096058.60090: Calling groups_inventory to load vars for managed_node3 7530 1727096058.60093: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096058.60104: Calling all_plugins_play to load vars for managed_node3 7530 1727096058.60108: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096058.60111: Calling groups_plugins_play to load vars for managed_node3 7530 1727096058.61891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096058.63513: done with get_vars() 7530 1727096058.63548: done getting variables 7530 1727096058.63618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:54:18 -0400 (0:00:00.066) 0:00:49.425 ****** 7530 1727096058.63673: entering _queue_task() for managed_node3/debug 7530 1727096058.64118: worker is 1 (out of 1 available) 7530 1727096058.64131: exiting _queue_task() 
for managed_node3/debug 7530 1727096058.64144: done queuing things up, now waiting for results queue to drain 7530 1727096058.64146: waiting for pending results... 7530 1727096058.64493: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7530 1727096058.64775: in run() - task 0afff68d-5257-086b-f4f0-00000000011f 7530 1727096058.64779: variable 'ansible_search_path' from source: unknown 7530 1727096058.64782: variable 'ansible_search_path' from source: unknown 7530 1727096058.64785: calling self._execute() 7530 1727096058.64805: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.64817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.64834: variable 'omit' from source: magic vars 7530 1727096058.65258: variable 'ansible_distribution_major_version' from source: facts 7530 1727096058.65278: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096058.65399: variable 'network_state' from source: role '' defaults 7530 1727096058.65416: Evaluated conditional (network_state != {}): False 7530 1727096058.65423: when evaluation is False, skipping this task 7530 1727096058.65429: _execute() done 7530 1727096058.65435: dumping result to json 7530 1727096058.65449: done dumping result, returning 7530 1727096058.65462: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-086b-f4f0-00000000011f] 7530 1727096058.65474: sending task result for task 0afff68d-5257-086b-f4f0-00000000011f skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7530 1727096058.65746: no more pending results, returning what we have 7530 1727096058.65751: results queue empty 7530 1727096058.65752: checking for any_errors_fatal 7530 1727096058.65763: done checking for any_errors_fatal 7530 1727096058.65764: checking for 
max_fail_percentage 7530 1727096058.65766: done checking for max_fail_percentage 7530 1727096058.65769: checking to see if all hosts have failed and the running result is not ok 7530 1727096058.65770: done checking to see if all hosts have failed 7530 1727096058.65771: getting the remaining hosts for this loop 7530 1727096058.65772: done getting the remaining hosts for this loop 7530 1727096058.65781: getting the next task for host managed_node3 7530 1727096058.65789: done getting next task for host managed_node3 7530 1727096058.65793: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096058.65796: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096058.65808: done sending task result for task 0afff68d-5257-086b-f4f0-00000000011f 7530 1727096058.65810: WORKER PROCESS EXITING 7530 1727096058.65829: getting variables 7530 1727096058.65831: in VariableManager get_vars() 7530 1727096058.66017: Calling all_inventory to load vars for managed_node3 7530 1727096058.66020: Calling groups_inventory to load vars for managed_node3 7530 1727096058.66023: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096058.66034: Calling all_plugins_play to load vars for managed_node3 7530 1727096058.66038: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096058.66041: Calling groups_plugins_play to load vars for managed_node3 7530 1727096058.67431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096058.69169: done with get_vars() 7530 1727096058.69206: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:54:18 -0400 (0:00:00.056) 0:00:49.481 ****** 7530 1727096058.69330: entering _queue_task() for managed_node3/ping 7530 1727096058.69805: worker is 1 (out of 1 available) 7530 1727096058.69818: exiting _queue_task() for managed_node3/ping 7530 1727096058.69829: done queuing things up, now waiting for results queue to drain 7530 1727096058.69830: waiting for pending results... 
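The skip just above ("Evaluated conditional (network_state != {}): False, when evaluation is False, skipping this task") shows Ansible's `when:` gating: every listed conditional must evaluate truthy or the task is skipped and a `false_condition` is reported. The following is a simplified sketch of that AND-semantics check — Ansible actually renders conditionals through Jinja2 with full variable precedence, not Python `eval()`, so treat the helper name and mechanism as illustrative only:

```python
def evaluate_when(conditions, variables):
    """Return True only if every conditional evaluates truthy.

    Ansible applies AND semantics to a list under `when:`; the first
    falsy conditional short-circuits the task into 'skipping'.
    NOTE: real Ansible uses Jinja2 templating, not eval() -- this is a
    deliberately simplified stand-in.
    """
    for cond in conditions:
        if not eval(cond, {}, dict(variables)):
            return False
    return True

# Mirrors the log: the version check passes, but network_state is the
# role default {} so the debug task is skipped.
variables = {"ansible_distribution_major_version": "40", "network_state": {}}
conditions = ["ansible_distribution_major_version != '6'", "network_state != {}"]
evaluate_when(conditions, variables)  # False -> task is skipped
```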
7530 1727096058.70072: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7530 1727096058.70230: in run() - task 0afff68d-5257-086b-f4f0-000000000120 7530 1727096058.70251: variable 'ansible_search_path' from source: unknown 7530 1727096058.70258: variable 'ansible_search_path' from source: unknown 7530 1727096058.70305: calling self._execute() 7530 1727096058.70414: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.70425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.70441: variable 'omit' from source: magic vars 7530 1727096058.70974: variable 'ansible_distribution_major_version' from source: facts 7530 1727096058.70977: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096058.70980: variable 'omit' from source: magic vars 7530 1727096058.70982: variable 'omit' from source: magic vars 7530 1727096058.70983: variable 'omit' from source: magic vars 7530 1727096058.71001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096058.71041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096058.71066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096058.71091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096058.71116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096058.71150: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096058.71158: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.71166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096058.71278: Set connection var ansible_pipelining to False 7530 1727096058.71290: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096058.71299: Set connection var ansible_timeout to 10 7530 1727096058.71318: Set connection var ansible_shell_executable to /bin/sh 7530 1727096058.71326: Set connection var ansible_shell_type to sh 7530 1727096058.71333: Set connection var ansible_connection to ssh 7530 1727096058.71427: variable 'ansible_shell_executable' from source: unknown 7530 1727096058.71430: variable 'ansible_connection' from source: unknown 7530 1727096058.71433: variable 'ansible_module_compression' from source: unknown 7530 1727096058.71435: variable 'ansible_shell_type' from source: unknown 7530 1727096058.71437: variable 'ansible_shell_executable' from source: unknown 7530 1727096058.71438: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096058.71440: variable 'ansible_pipelining' from source: unknown 7530 1727096058.71442: variable 'ansible_timeout' from source: unknown 7530 1727096058.71444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096058.71630: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7530 1727096058.71655: variable 'omit' from source: magic vars 7530 1727096058.71665: starting attempt loop 7530 1727096058.71675: running the handler 7530 1727096058.71694: _low_level_execute_command(): starting 7530 1727096058.71707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096058.72528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.72555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096058.72604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.72656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.74329: stdout chunk (state=3): >>>/root <<< 7530 1727096058.74457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.74472: stderr chunk (state=3): >>><<< 7530 1727096058.74475: stdout chunk (state=3): >>><<< 7530 1727096058.74506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096058.74583: _low_level_execute_command(): starting 7530 1727096058.74587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933 `" && echo ansible-tmp-1727096058.7450602-9334-66110697395933="` echo /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933 `" ) && sleep 0' 7530 1727096058.75267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096058.75291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096058.75307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.75402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.77332: stdout chunk (state=3): >>>ansible-tmp-1727096058.7450602-9334-66110697395933=/root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933 <<< 7530 1727096058.77437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.77474: stderr chunk (state=3): >>><<< 7530 1727096058.77477: stdout chunk (state=3): >>><<< 7530 1727096058.77494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096058.7450602-9334-66110697395933=/root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096058.77673: variable 'ansible_module_compression' from source: unknown 7530 1727096058.77676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7530 1727096058.77678: variable 'ansible_facts' from source: unknown 7530 1727096058.77725: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py 7530 1727096058.77971: Sending initial data 7530 1727096058.77982: Sent initial data (150 bytes) 7530 1727096058.78539: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096058.78554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096058.78570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.78632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096058.78681: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.78699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.80291: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096058.80315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096058.80348: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpg7hco0ve /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py <<< 7530 1727096058.80351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py" <<< 7530 1727096058.80379: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpg7hco0ve" to remote "/root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py" <<< 7530 1727096058.80998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.81173: stderr chunk (state=3): >>><<< 7530 1727096058.81176: stdout chunk (state=3): >>><<< 7530 1727096058.81178: done transferring module to remote 7530 1727096058.81180: _low_level_execute_command(): starting 7530 1727096058.81182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/ /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py && sleep 0' 7530 1727096058.81742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096058.81756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096058.81775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096058.81791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096058.81846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.81874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096058.81932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.81985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.83796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096058.83829: stderr chunk (state=3): >>><<< 7530 1727096058.83839: stdout chunk (state=3): >>><<< 7530 1727096058.83851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096058.83854: _low_level_execute_command(): starting 7530 1727096058.83859: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/AnsiballZ_ping.py && sleep 0' 7530 1727096058.84330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096058.84334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.84338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096058.84342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096058.84344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096058.84428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096058.84483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096058.99931: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7530 1727096059.01298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 7530 1727096059.01325: stderr chunk (state=3): >>><<< 7530 1727096059.01329: stdout chunk (state=3): >>><<< 7530 1727096059.01347: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096059.01371: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096059.01380: _low_level_execute_command(): starting 7530 1727096059.01385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096058.7450602-9334-66110697395933/ > /dev/null 2>&1 && sleep 0' 7530 1727096059.01825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096059.01831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096059.01859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.01863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096059.01865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.01929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096059.01932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096059.01939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.01975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.03843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096059.03872: stderr chunk (state=3): >>><<< 7530 1727096059.03876: stdout chunk (state=3): >>><<< 7530 1727096059.03890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 7530 1727096059.03898: handler run complete 7530 1727096059.03916: attempt loop complete, returning result 7530 1727096059.03918: _execute() done 7530 1727096059.03921: dumping result to json 7530 1727096059.03923: done dumping result, returning 7530 1727096059.03928: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-086b-f4f0-000000000120] 7530 1727096059.03932: sending task result for task 0afff68d-5257-086b-f4f0-000000000120 7530 1727096059.04022: done sending task result for task 0afff68d-5257-086b-f4f0-000000000120 7530 1727096059.04025: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7530 1727096059.04092: no more pending results, returning what we have 7530 1727096059.04096: results queue empty 7530 1727096059.04097: checking for any_errors_fatal 7530 1727096059.04103: done checking for any_errors_fatal 7530 1727096059.04104: checking for max_fail_percentage 7530 1727096059.04106: done checking for max_fail_percentage 7530 1727096059.04107: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.04108: done checking to see if all hosts have failed 7530 1727096059.04108: getting the remaining hosts for this loop 7530 1727096059.04110: done getting the remaining hosts for this loop 7530 1727096059.04113: getting the next task for host managed_node3 7530 1727096059.04123: done getting next task for host managed_node3 7530 1727096059.04125: ^ task is: TASK: meta (role_complete) 7530 1727096059.04129: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096059.04146: getting variables 7530 1727096059.04148: in VariableManager get_vars() 7530 1727096059.04199: Calling all_inventory to load vars for managed_node3 7530 1727096059.04202: Calling groups_inventory to load vars for managed_node3 7530 1727096059.04204: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.04215: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.04217: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.04220: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.10011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.10873: done with get_vars() 7530 1727096059.10902: done getting variables 7530 1727096059.10958: done queuing things up, now waiting for results queue to drain 7530 1727096059.10960: results queue empty 7530 1727096059.10961: checking for any_errors_fatal 7530 1727096059.10963: done checking for any_errors_fatal 7530 1727096059.10964: checking for max_fail_percentage 7530 1727096059.10964: done checking for max_fail_percentage 7530 1727096059.10965: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.10965: done checking to see if all hosts have failed 7530 1727096059.10966: getting the remaining hosts for this loop 7530 1727096059.10966: done getting the remaining hosts for this loop 7530 1727096059.10970: getting the next task for host managed_node3 7530 1727096059.10973: done getting next task for host managed_node3 7530 1727096059.10975: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7530 1727096059.10976: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096059.10978: getting variables 7530 1727096059.10978: in VariableManager get_vars() 7530 1727096059.10992: Calling all_inventory to load vars for managed_node3 7530 1727096059.10994: Calling groups_inventory to load vars for managed_node3 7530 1727096059.10995: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.10999: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.11000: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.11002: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.11638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.12491: done with get_vars() 7530 1727096059.12509: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:145 Monday 23 September 2024 08:54:19 -0400 (0:00:00.432) 0:00:49.914 ****** 7530 1727096059.12568: entering _queue_task() for managed_node3/include_tasks 7530 1727096059.12843: worker is 1 (out of 1 available) 7530 1727096059.12857: exiting _queue_task() for managed_node3/include_tasks 7530 1727096059.12871: done queuing things up, now waiting for results queue to drain 7530 1727096059.12874: waiting for pending results... 
7530 1727096059.13057: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7530 1727096059.13137: in run() - task 0afff68d-5257-086b-f4f0-000000000150 7530 1727096059.13148: variable 'ansible_search_path' from source: unknown 7530 1727096059.13180: calling self._execute() 7530 1727096059.13267: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.13273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.13282: variable 'omit' from source: magic vars 7530 1727096059.13583: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.13593: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.13599: _execute() done 7530 1727096059.13602: dumping result to json 7530 1727096059.13605: done dumping result, returning 7530 1727096059.13613: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0afff68d-5257-086b-f4f0-000000000150] 7530 1727096059.13618: sending task result for task 0afff68d-5257-086b-f4f0-000000000150 7530 1727096059.13717: done sending task result for task 0afff68d-5257-086b-f4f0-000000000150 7530 1727096059.13719: WORKER PROCESS EXITING 7530 1727096059.13778: no more pending results, returning what we have 7530 1727096059.13783: in VariableManager get_vars() 7530 1727096059.13843: Calling all_inventory to load vars for managed_node3 7530 1727096059.13847: Calling groups_inventory to load vars for managed_node3 7530 1727096059.13849: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.13861: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.13864: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.13866: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.14798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7530 1727096059.15662: done with get_vars() 7530 1727096059.15683: variable 'ansible_search_path' from source: unknown 7530 1727096059.15695: we have included files to process 7530 1727096059.15696: generating all_blocks data 7530 1727096059.15698: done generating all_blocks data 7530 1727096059.15703: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096059.15704: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096059.15706: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7530 1727096059.15976: in VariableManager get_vars() 7530 1727096059.15996: done with get_vars() 7530 1727096059.16430: done processing included file 7530 1727096059.16432: iterating over new_blocks loaded from include file 7530 1727096059.16433: in VariableManager get_vars() 7530 1727096059.16453: done with get_vars() 7530 1727096059.16454: filtering new block on tags 7530 1727096059.16480: done filtering new block on tags 7530 1727096059.16482: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7530 1727096059.16486: extending task lists for all hosts with included blocks 7530 1727096059.20286: done extending task lists 7530 1727096059.20288: done processing included files 7530 1727096059.20289: results queue empty 7530 1727096059.20289: checking for any_errors_fatal 7530 1727096059.20291: done checking for any_errors_fatal 7530 1727096059.20291: checking for max_fail_percentage 7530 1727096059.20292: done checking for max_fail_percentage 7530 1727096059.20292: checking to see if all hosts have failed and the running 
result is not ok 7530 1727096059.20293: done checking to see if all hosts have failed 7530 1727096059.20294: getting the remaining hosts for this loop 7530 1727096059.20295: done getting the remaining hosts for this loop 7530 1727096059.20296: getting the next task for host managed_node3 7530 1727096059.20299: done getting next task for host managed_node3 7530 1727096059.20301: ^ task is: TASK: Ensure state in ["present", "absent"] 7530 1727096059.20303: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096059.20305: getting variables 7530 1727096059.20305: in VariableManager get_vars() 7530 1727096059.20323: Calling all_inventory to load vars for managed_node3 7530 1727096059.20324: Calling groups_inventory to load vars for managed_node3 7530 1727096059.20326: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.20331: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.20332: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.20334: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.21093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.22066: done with get_vars() 7530 1727096059.22095: done getting variables 7530 1727096059.22141: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Monday 23 September 2024 08:54:19 -0400 (0:00:00.096) 0:00:50.010 ****** 7530 1727096059.22176: entering _queue_task() for managed_node3/fail 7530 1727096059.22532: worker is 1 (out of 1 available) 7530 1727096059.22543: exiting _queue_task() for managed_node3/fail 7530 1727096059.22556: done queuing things up, now waiting for results queue to drain 7530 1727096059.22558: waiting for pending results... 
7530 1727096059.23262: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7530 1727096059.23271: in run() - task 0afff68d-5257-086b-f4f0-000000001a6f 7530 1727096059.23288: variable 'ansible_search_path' from source: unknown 7530 1727096059.23357: variable 'ansible_search_path' from source: unknown 7530 1727096059.23361: calling self._execute() 7530 1727096059.23456: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.23476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.23492: variable 'omit' from source: magic vars 7530 1727096059.23892: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.23915: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.24064: variable 'state' from source: include params 7530 1727096059.24077: Evaluated conditional (state not in ["present", "absent"]): False 7530 1727096059.24087: when evaluation is False, skipping this task 7530 1727096059.24094: _execute() done 7530 1727096059.24117: dumping result to json 7530 1727096059.24120: done dumping result, returning 7530 1727096059.24123: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0afff68d-5257-086b-f4f0-000000001a6f] 7530 1727096059.24128: sending task result for task 0afff68d-5257-086b-f4f0-000000001a6f skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7530 1727096059.24349: no more pending results, returning what we have 7530 1727096059.24354: results queue empty 7530 1727096059.24355: checking for any_errors_fatal 7530 1727096059.24356: done checking for any_errors_fatal 7530 1727096059.24357: checking for max_fail_percentage 7530 1727096059.24359: done checking for max_fail_percentage 7530 1727096059.24360: checking to see if all hosts have failed and the 
running result is not ok 7530 1727096059.24361: done checking to see if all hosts have failed 7530 1727096059.24361: getting the remaining hosts for this loop 7530 1727096059.24362: done getting the remaining hosts for this loop 7530 1727096059.24366: getting the next task for host managed_node3 7530 1727096059.24374: done getting next task for host managed_node3 7530 1727096059.24377: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096059.24381: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096059.24384: getting variables 7530 1727096059.24386: in VariableManager get_vars() 7530 1727096059.24439: Calling all_inventory to load vars for managed_node3 7530 1727096059.24442: Calling groups_inventory to load vars for managed_node3 7530 1727096059.24445: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.24459: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.24462: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.24466: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.24609: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a6f 7530 1727096059.24614: WORKER PROCESS EXITING 7530 1727096059.25970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.27644: done with get_vars() 7530 1727096059.27677: done getting variables 7530 1727096059.27743: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Monday 23 September 2024 08:54:19 -0400 (0:00:00.056) 0:00:50.066 ****** 7530 1727096059.27782: entering _queue_task() for managed_node3/fail 7530 1727096059.28167: worker is 1 (out of 1 available) 7530 1727096059.28326: exiting _queue_task() for managed_node3/fail 7530 1727096059.28341: done queuing things up, now waiting for results queue to drain 7530 1727096059.28343: waiting for pending results... 
7530 1727096059.28552: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7530 1727096059.28678: in run() - task 0afff68d-5257-086b-f4f0-000000001a70 7530 1727096059.28756: variable 'ansible_search_path' from source: unknown 7530 1727096059.28760: variable 'ansible_search_path' from source: unknown 7530 1727096059.28764: calling self._execute() 7530 1727096059.28880: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.28899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.28917: variable 'omit' from source: magic vars 7530 1727096059.29329: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.29349: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.29501: variable 'type' from source: play vars 7530 1727096059.29516: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7530 1727096059.29541: when evaluation is False, skipping this task 7530 1727096059.29544: _execute() done 7530 1727096059.29546: dumping result to json 7530 1727096059.29548: done dumping result, returning 7530 1727096059.29625: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0afff68d-5257-086b-f4f0-000000001a70] 7530 1727096059.29628: sending task result for task 0afff68d-5257-086b-f4f0-000000001a70 7530 1727096059.29702: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a70 7530 1727096059.29705: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7530 1727096059.29770: no more pending results, returning what we have 7530 1727096059.29775: results queue empty 7530 1727096059.29777: checking for any_errors_fatal 7530 1727096059.29783: done checking for any_errors_fatal 7530 1727096059.29784: checking for 
max_fail_percentage 7530 1727096059.29786: done checking for max_fail_percentage 7530 1727096059.29787: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.29788: done checking to see if all hosts have failed 7530 1727096059.29789: getting the remaining hosts for this loop 7530 1727096059.29791: done getting the remaining hosts for this loop 7530 1727096059.29795: getting the next task for host managed_node3 7530 1727096059.29803: done getting next task for host managed_node3 7530 1727096059.29806: ^ task is: TASK: Include the task 'show_interfaces.yml' 7530 1727096059.29810: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096059.29816: getting variables 7530 1727096059.29818: in VariableManager get_vars() 7530 1727096059.30093: Calling all_inventory to load vars for managed_node3 7530 1727096059.30096: Calling groups_inventory to load vars for managed_node3 7530 1727096059.30099: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.30111: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.30115: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.30118: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.31726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.33348: done with get_vars() 7530 1727096059.33381: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Monday 23 September 2024 08:54:19 -0400 (0:00:00.057) 0:00:50.123 ****** 7530 1727096059.33490: entering _queue_task() for managed_node3/include_tasks 7530 1727096059.33959: worker is 1 (out of 1 available) 7530 1727096059.33974: exiting _queue_task() for managed_node3/include_tasks 7530 1727096059.34100: done queuing things up, now waiting for results queue to drain 7530 1727096059.34103: waiting for pending results... 
7530 1727096059.34389: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7530 1727096059.34484: in run() - task 0afff68d-5257-086b-f4f0-000000001a71 7530 1727096059.34489: variable 'ansible_search_path' from source: unknown 7530 1727096059.34491: variable 'ansible_search_path' from source: unknown 7530 1727096059.34526: calling self._execute() 7530 1727096059.34658: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.34700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.34704: variable 'omit' from source: magic vars 7530 1727096059.35124: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.35149: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.35159: _execute() done 7530 1727096059.35191: dumping result to json 7530 1727096059.35196: done dumping result, returning 7530 1727096059.35198: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-086b-f4f0-000000001a71] 7530 1727096059.35201: sending task result for task 0afff68d-5257-086b-f4f0-000000001a71 7530 1727096059.35374: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a71 7530 1727096059.35378: WORKER PROCESS EXITING 7530 1727096059.35480: no more pending results, returning what we have 7530 1727096059.35485: in VariableManager get_vars() 7530 1727096059.35702: Calling all_inventory to load vars for managed_node3 7530 1727096059.35705: Calling groups_inventory to load vars for managed_node3 7530 1727096059.35708: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.35720: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.35723: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.35727: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.37395: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.39873: done with get_vars() 7530 1727096059.39894: variable 'ansible_search_path' from source: unknown 7530 1727096059.39896: variable 'ansible_search_path' from source: unknown 7530 1727096059.39934: we have included files to process 7530 1727096059.39936: generating all_blocks data 7530 1727096059.39937: done generating all_blocks data 7530 1727096059.39943: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096059.39944: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096059.39946: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7530 1727096059.40070: in VariableManager get_vars() 7530 1727096059.40102: done with get_vars() 7530 1727096059.40224: done processing included file 7530 1727096059.40227: iterating over new_blocks loaded from include file 7530 1727096059.40228: in VariableManager get_vars() 7530 1727096059.40254: done with get_vars() 7530 1727096059.40256: filtering new block on tags 7530 1727096059.40282: done filtering new block on tags 7530 1727096059.40285: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7530 1727096059.40291: extending task lists for all hosts with included blocks 7530 1727096059.40724: done extending task lists 7530 1727096059.40726: done processing included files 7530 1727096059.40727: results queue empty 7530 1727096059.40728: checking for any_errors_fatal 7530 1727096059.40731: done checking for any_errors_fatal 7530 1727096059.40732: checking for max_fail_percentage 7530 
1727096059.40733: done checking for max_fail_percentage 7530 1727096059.40734: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.40735: done checking to see if all hosts have failed 7530 1727096059.40736: getting the remaining hosts for this loop 7530 1727096059.40737: done getting the remaining hosts for this loop 7530 1727096059.40739: getting the next task for host managed_node3 7530 1727096059.40744: done getting next task for host managed_node3 7530 1727096059.40746: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7530 1727096059.40749: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096059.40751: getting variables 7530 1727096059.40752: in VariableManager get_vars() 7530 1727096059.40774: Calling all_inventory to load vars for managed_node3 7530 1727096059.40777: Calling groups_inventory to load vars for managed_node3 7530 1727096059.40779: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.40786: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.40788: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.40791: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.42011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.43613: done with get_vars() 7530 1727096059.43650: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:54:19 -0400 (0:00:00.102) 0:00:50.225 ****** 7530 1727096059.43760: entering _queue_task() for managed_node3/include_tasks 7530 1727096059.44285: worker is 1 (out of 1 available) 7530 1727096059.44296: exiting _queue_task() for managed_node3/include_tasks 7530 1727096059.44308: done queuing things up, now waiting for results queue to drain 7530 1727096059.44310: waiting for pending results... 
7530 1727096059.44687: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7530 1727096059.44697: in run() - task 0afff68d-5257-086b-f4f0-000000001d1c 7530 1727096059.44702: variable 'ansible_search_path' from source: unknown 7530 1727096059.44706: variable 'ansible_search_path' from source: unknown 7530 1727096059.44743: calling self._execute() 7530 1727096059.44848: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.44854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.44865: variable 'omit' from source: magic vars 7530 1727096059.45413: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.45438: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.45442: _execute() done 7530 1727096059.45446: dumping result to json 7530 1727096059.45448: done dumping result, returning 7530 1727096059.45573: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-086b-f4f0-000000001d1c] 7530 1727096059.45576: sending task result for task 0afff68d-5257-086b-f4f0-000000001d1c 7530 1727096059.45645: done sending task result for task 0afff68d-5257-086b-f4f0-000000001d1c 7530 1727096059.45648: WORKER PROCESS EXITING 7530 1727096059.45814: no more pending results, returning what we have 7530 1727096059.45819: in VariableManager get_vars() 7530 1727096059.45871: Calling all_inventory to load vars for managed_node3 7530 1727096059.45874: Calling groups_inventory to load vars for managed_node3 7530 1727096059.45876: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.45888: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.45891: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.45899: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.47616: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.49212: done with get_vars() 7530 1727096059.49245: variable 'ansible_search_path' from source: unknown 7530 1727096059.49246: variable 'ansible_search_path' from source: unknown 7530 1727096059.49312: we have included files to process 7530 1727096059.49314: generating all_blocks data 7530 1727096059.49315: done generating all_blocks data 7530 1727096059.49316: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096059.49318: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096059.49320: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7530 1727096059.49592: done processing included file 7530 1727096059.49594: iterating over new_blocks loaded from include file 7530 1727096059.49596: in VariableManager get_vars() 7530 1727096059.49625: done with get_vars() 7530 1727096059.49627: filtering new block on tags 7530 1727096059.49646: done filtering new block on tags 7530 1727096059.49648: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7530 1727096059.49653: extending task lists for all hosts with included blocks 7530 1727096059.49793: done extending task lists 7530 1727096059.49795: done processing included files 7530 1727096059.49796: results queue empty 7530 1727096059.49796: checking for any_errors_fatal 7530 1727096059.49799: done checking for any_errors_fatal 7530 1727096059.49800: checking for max_fail_percentage 7530 1727096059.49801: done checking for max_fail_percentage 7530 
1727096059.49801: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.49802: done checking to see if all hosts have failed 7530 1727096059.49803: getting the remaining hosts for this loop 7530 1727096059.49804: done getting the remaining hosts for this loop 7530 1727096059.49806: getting the next task for host managed_node3 7530 1727096059.49810: done getting next task for host managed_node3 7530 1727096059.49813: ^ task is: TASK: Gather current interface info 7530 1727096059.49816: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096059.49819: getting variables 7530 1727096059.49820: in VariableManager get_vars() 7530 1727096059.49838: Calling all_inventory to load vars for managed_node3 7530 1727096059.49840: Calling groups_inventory to load vars for managed_node3 7530 1727096059.49841: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.49847: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.49849: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.49851: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.51143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.52792: done with get_vars() 7530 1727096059.52831: done getting variables 7530 1727096059.52887: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:54:19 -0400 (0:00:00.091) 0:00:50.317 ****** 7530 1727096059.52926: entering _queue_task() for managed_node3/command 7530 1727096059.53325: worker is 1 (out of 1 available) 7530 1727096059.53338: exiting _queue_task() for managed_node3/command 7530 1727096059.53351: done queuing things up, now waiting for results queue to drain 7530 1727096059.53353: waiting for pending results... 
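The TASK banner above marks the "Gather current interface info" task from get_current_interfaces.yml:3. The module_args recorded later in this trace (chdir=/sys/class/net, _raw_params="ls -1") show that it simply lists the entries of /sys/class/net on the managed node. A minimal Python sketch of the same enumeration (the function name and path parameter are illustrative, not taken from the trace):

```python
import os

def list_interfaces(sysfs_net="/sys/class/net"):
    """Enumerate network interface names the way `ls -1 /sys/class/net` does.

    `sysfs_net` is a parameter only so this sketch is testable; the task in
    the trace always reads /sys/class/net on the managed node.
    """
    # Each entry under /sys/class/net is one interface (eth0, lo, veth0, ...).
    # Sorted to match the ordering `ls -1` produces.
    return sorted(os.listdir(sysfs_net))
```

On the managed node in this run, the equivalent command returned eth0, lo, peerveth0 and veth0.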
7530 1727096059.53672: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7530 1727096059.53823: in run() - task 0afff68d-5257-086b-f4f0-000000001d53 7530 1727096059.53828: variable 'ansible_search_path' from source: unknown 7530 1727096059.53832: variable 'ansible_search_path' from source: unknown 7530 1727096059.53843: calling self._execute() 7530 1727096059.54039: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.54044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.54047: variable 'omit' from source: magic vars 7530 1727096059.54334: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.54346: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.54352: variable 'omit' from source: magic vars 7530 1727096059.54404: variable 'omit' from source: magic vars 7530 1727096059.54442: variable 'omit' from source: magic vars 7530 1727096059.54573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096059.54578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096059.54580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096059.54583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096059.54586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096059.54609: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096059.54613: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.54615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.54721: Set connection 
var ansible_pipelining to False 7530 1727096059.54728: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096059.54733: Set connection var ansible_timeout to 10 7530 1727096059.54743: Set connection var ansible_shell_executable to /bin/sh 7530 1727096059.54746: Set connection var ansible_shell_type to sh 7530 1727096059.54748: Set connection var ansible_connection to ssh 7530 1727096059.54778: variable 'ansible_shell_executable' from source: unknown 7530 1727096059.54782: variable 'ansible_connection' from source: unknown 7530 1727096059.54784: variable 'ansible_module_compression' from source: unknown 7530 1727096059.54787: variable 'ansible_shell_type' from source: unknown 7530 1727096059.54790: variable 'ansible_shell_executable' from source: unknown 7530 1727096059.54792: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.54795: variable 'ansible_pipelining' from source: unknown 7530 1727096059.54797: variable 'ansible_timeout' from source: unknown 7530 1727096059.54799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.55175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096059.55180: variable 'omit' from source: magic vars 7530 1727096059.55182: starting attempt loop 7530 1727096059.55184: running the handler 7530 1727096059.55187: _low_level_execute_command(): starting 7530 1727096059.55189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096059.55787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096059.55813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096059.55825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.55895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.57635: stdout chunk (state=3): >>>/root <<< 7530 1727096059.57780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096059.57816: stdout chunk (state=3): >>><<< 7530 1727096059.57820: stderr chunk (state=3): >>><<< 7530 1727096059.57953: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096059.57957: _low_level_execute_command(): starting 7530 1727096059.57960: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922 `" && echo ansible-tmp-1727096059.5784686-9364-240949585611922="` echo /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922 `" ) && sleep 0' 7530 1727096059.58661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096059.58685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096059.58756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.58891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096059.58898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.58939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.61007: stdout chunk (state=3): >>>ansible-tmp-1727096059.5784686-9364-240949585611922=/root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922 <<< 7530 1727096059.61207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096059.61211: stdout chunk (state=3): >>><<< 7530 1727096059.61213: stderr chunk (state=3): >>><<< 7530 1727096059.61374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096059.5784686-9364-240949585611922=/root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096059.61377: variable 'ansible_module_compression' from source: unknown 7530 1727096059.61380: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096059.61382: variable 'ansible_facts' from source: unknown 7530 1727096059.61515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py 7530 1727096059.62071: Sending initial data 7530 1727096059.62076: Sent initial data (154 bytes) 7530 1727096059.62674: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096059.62691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096059.62702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.62766: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096059.62790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.62866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.64549: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096059.64605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7530 1727096059.64676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyp5esstc /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py <<< 7530 1727096059.64680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py" <<< 7530 1727096059.64683: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpyp5esstc" to remote "/root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py" <<< 7530 1727096059.65475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096059.65479: stderr chunk (state=3): >>><<< 7530 1727096059.65482: stdout chunk (state=3): >>><<< 7530 1727096059.65484: done transferring module to remote 7530 1727096059.65486: _low_level_execute_command(): starting 7530 1727096059.65488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/ /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py && sleep 0' 7530 1727096059.66085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096059.66092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096059.66103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096059.66124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096059.66139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096059.66142: 
stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096059.66152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.66165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7530 1727096059.66175: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 7530 1727096059.66241: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.66264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096059.66281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096059.66299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.66363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.68375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096059.68379: stderr chunk (state=3): >>><<< 7530 1727096059.68382: stdout chunk (state=3): >>><<< 7530 1727096059.68385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096059.68387: _low_level_execute_command(): starting 7530 1727096059.68390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/AnsiballZ_command.py && sleep 0' 7530 1727096059.68983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096059.68999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096059.69009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096059.69024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096059.69083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.69130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096059.69144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096059.69161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.69241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.85613: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:54:19.850014", "end": "2024-09-23 08:54:19.853529", "delta": "0:00:00.003515", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096059.87339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096059.87351: stderr chunk (state=3): >>><<< 7530 1727096059.87373: stdout chunk (state=3): >>><<< 7530 1727096059.87412: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:54:19.850014", "end": "2024-09-23 08:54:19.853529", "delta": "0:00:00.003515", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
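The AnsiballZ_command module hands its result back as a single JSON object on stdout; the executor parses it, and the callback later splits the `stdout` field into the lines shown under STDOUT. A sketch of that parsing step, using an abridged copy of the JSON payload captured above (the variable names here are illustrative):

```python
import json

# Abridged JSON result string as captured in the trace above.
raw = ('{"changed": true, "stdout": "eth0\\nlo\\npeerveth0\\nveth0", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"]}')

result = json.loads(raw)

# Ansible derives stdout_lines by splitting stdout on newlines; the later
# "Set current_interfaces" task stores exactly this list as a fact.
current_interfaces = result["stdout"].splitlines()
print(current_interfaces)  # ['eth0', 'lo', 'peerveth0', 'veth0']
```

This matches the `current_interfaces` fact set a few records further down in the trace.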
7530 1727096059.87443: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096059.87455: _low_level_execute_command(): starting 7530 1727096059.87458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096059.5784686-9364-240949585611922/ > /dev/null 2>&1 && sleep 0' 7530 1727096059.88081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096059.88127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 
7530 1727096059.88137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096059.88162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096059.88192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096059.88264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096059.90117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096059.90142: stderr chunk (state=3): >>><<< 7530 1727096059.90145: stdout chunk (state=3): >>><<< 7530 1727096059.90159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096059.90165: handler 
run complete 7530 1727096059.90188: Evaluated conditional (False): False 7530 1727096059.90197: attempt loop complete, returning result 7530 1727096059.90200: _execute() done 7530 1727096059.90202: dumping result to json 7530 1727096059.90206: done dumping result, returning 7530 1727096059.90214: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0afff68d-5257-086b-f4f0-000000001d53] 7530 1727096059.90218: sending task result for task 0afff68d-5257-086b-f4f0-000000001d53 7530 1727096059.90318: done sending task result for task 0afff68d-5257-086b-f4f0-000000001d53 7530 1727096059.90321: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003515", "end": "2024-09-23 08:54:19.853529", "rc": 0, "start": "2024-09-23 08:54:19.850014" } STDOUT: eth0 lo peerveth0 veth0 7530 1727096059.90399: no more pending results, returning what we have 7530 1727096059.90403: results queue empty 7530 1727096059.90404: checking for any_errors_fatal 7530 1727096059.90405: done checking for any_errors_fatal 7530 1727096059.90406: checking for max_fail_percentage 7530 1727096059.90407: done checking for max_fail_percentage 7530 1727096059.90408: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.90409: done checking to see if all hosts have failed 7530 1727096059.90410: getting the remaining hosts for this loop 7530 1727096059.90412: done getting the remaining hosts for this loop 7530 1727096059.90415: getting the next task for host managed_node3 7530 1727096059.90423: done getting next task for host managed_node3 7530 1727096059.90425: ^ task is: TASK: Set current_interfaces 7530 1727096059.90433: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096059.90439: getting variables 7530 1727096059.90441: in VariableManager get_vars() 7530 1727096059.90490: Calling all_inventory to load vars for managed_node3 7530 1727096059.90493: Calling groups_inventory to load vars for managed_node3 7530 1727096059.90495: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.90506: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.90508: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.90511: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.91973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.92878: done with get_vars() 7530 1727096059.92904: done getting variables 7530 1727096059.92954: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:54:19 -0400 (0:00:00.400) 0:00:50.718 ****** 7530 1727096059.92986: entering _queue_task() for managed_node3/set_fact 7530 1727096059.93252: worker is 1 (out of 1 available) 7530 1727096059.93266: exiting _queue_task() for managed_node3/set_fact 7530 1727096059.93282: done queuing things up, now waiting for results queue to drain 7530 1727096059.93284: waiting for pending results... 7530 1727096059.93464: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7530 1727096059.93551: in run() - task 0afff68d-5257-086b-f4f0-000000001d54 7530 1727096059.93563: variable 'ansible_search_path' from source: unknown 7530 1727096059.93569: variable 'ansible_search_path' from source: unknown 7530 1727096059.93596: calling self._execute() 7530 1727096059.93683: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.93687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.93697: variable 'omit' from source: magic vars 7530 1727096059.93975: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.93986: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.93991: variable 'omit' from source: magic vars 7530 1727096059.94025: variable 'omit' from source: magic vars 7530 1727096059.94106: variable '_current_interfaces' from source: set_fact 7530 1727096059.94159: variable 'omit' from source: magic vars 7530 1727096059.94199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096059.94226: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096059.94243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096059.94256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096059.94265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096059.94293: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096059.94296: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.94299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.94365: Set connection var ansible_pipelining to False 7530 1727096059.94372: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096059.94379: Set connection var ansible_timeout to 10 7530 1727096059.94391: Set connection var ansible_shell_executable to /bin/sh 7530 1727096059.94394: Set connection var ansible_shell_type to sh 7530 1727096059.94396: Set connection var ansible_connection to ssh 7530 1727096059.94413: variable 'ansible_shell_executable' from source: unknown 7530 1727096059.94416: variable 'ansible_connection' from source: unknown 7530 1727096059.94419: variable 'ansible_module_compression' from source: unknown 7530 1727096059.94421: variable 'ansible_shell_type' from source: unknown 7530 1727096059.94423: variable 'ansible_shell_executable' from source: unknown 7530 1727096059.94426: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.94428: variable 'ansible_pipelining' from source: unknown 7530 1727096059.94431: variable 'ansible_timeout' from source: unknown 7530 1727096059.94438: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7530 1727096059.94541: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096059.94548: variable 'omit' from source: magic vars 7530 1727096059.94553: starting attempt loop 7530 1727096059.94556: running the handler 7530 1727096059.94566: handler run complete 7530 1727096059.94576: attempt loop complete, returning result 7530 1727096059.94578: _execute() done 7530 1727096059.94581: dumping result to json 7530 1727096059.94583: done dumping result, returning 7530 1727096059.94590: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0afff68d-5257-086b-f4f0-000000001d54] 7530 1727096059.94594: sending task result for task 0afff68d-5257-086b-f4f0-000000001d54 7530 1727096059.94680: done sending task result for task 0afff68d-5257-086b-f4f0-000000001d54 7530 1727096059.94683: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7530 1727096059.94772: no more pending results, returning what we have 7530 1727096059.94775: results queue empty 7530 1727096059.94776: checking for any_errors_fatal 7530 1727096059.94783: done checking for any_errors_fatal 7530 1727096059.94784: checking for max_fail_percentage 7530 1727096059.94785: done checking for max_fail_percentage 7530 1727096059.94786: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.94787: done checking to see if all hosts have failed 7530 1727096059.94788: getting the remaining hosts for this loop 7530 1727096059.94789: done getting the remaining hosts for this loop 7530 1727096059.94793: getting the next task for host managed_node3 7530 1727096059.94800: done getting next task for 
host managed_node3 7530 1727096059.94803: ^ task is: TASK: Show current_interfaces 7530 1727096059.94806: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096059.94810: getting variables 7530 1727096059.94811: in VariableManager get_vars() 7530 1727096059.94858: Calling all_inventory to load vars for managed_node3 7530 1727096059.94861: Calling groups_inventory to load vars for managed_node3 7530 1727096059.94863: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.94879: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.94882: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.94887: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.95683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096059.96673: done with get_vars() 7530 1727096059.96690: done getting variables 7530 1727096059.96737: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:54:19 -0400 (0:00:00.037) 0:00:50.755 ****** 7530 1727096059.96761: entering _queue_task() for managed_node3/debug 7530 1727096059.97015: worker is 1 (out of 1 available) 7530 1727096059.97029: exiting _queue_task() for managed_node3/debug 7530 1727096059.97044: done queuing things up, now waiting for results queue to drain 7530 1727096059.97046: waiting for pending results... 7530 1727096059.97223: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7530 1727096059.97298: in run() - task 0afff68d-5257-086b-f4f0-000000001d1d 7530 1727096059.97310: variable 'ansible_search_path' from source: unknown 7530 1727096059.97312: variable 'ansible_search_path' from source: unknown 7530 1727096059.97342: calling self._execute() 7530 1727096059.97423: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.97428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.97439: variable 'omit' from source: magic vars 7530 1727096059.97717: variable 'ansible_distribution_major_version' from source: facts 7530 1727096059.97729: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096059.97734: variable 'omit' from source: magic vars 7530 1727096059.97763: variable 'omit' from source: magic vars 7530 1727096059.97834: variable 'current_interfaces' from source: set_fact 7530 1727096059.97857: variable 'omit' from source: magic vars 7530 1727096059.97892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096059.97920: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096059.97940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096059.97952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096059.97961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096059.97986: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096059.97989: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.97992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096059.98063: Set connection var ansible_pipelining to False 7530 1727096059.98069: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096059.98075: Set connection var ansible_timeout to 10 7530 1727096059.98082: Set connection var ansible_shell_executable to /bin/sh 7530 1727096059.98085: Set connection var ansible_shell_type to sh 7530 1727096059.98088: Set connection var ansible_connection to ssh 7530 1727096059.98106: variable 'ansible_shell_executable' from source: unknown 7530 1727096059.98109: variable 'ansible_connection' from source: unknown 7530 1727096059.98112: variable 'ansible_module_compression' from source: unknown 7530 1727096059.98114: variable 'ansible_shell_type' from source: unknown 7530 1727096059.98117: variable 'ansible_shell_executable' from source: unknown 7530 1727096059.98119: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096059.98121: variable 'ansible_pipelining' from source: unknown 7530 1727096059.98124: variable 'ansible_timeout' from source: unknown 7530 1727096059.98126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 
1727096059.98228: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096059.98240: variable 'omit' from source: magic vars 7530 1727096059.98243: starting attempt loop 7530 1727096059.98246: running the handler 7530 1727096059.98286: handler run complete 7530 1727096059.98296: attempt loop complete, returning result 7530 1727096059.98300: _execute() done 7530 1727096059.98303: dumping result to json 7530 1727096059.98305: done dumping result, returning 7530 1727096059.98310: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0afff68d-5257-086b-f4f0-000000001d1d] 7530 1727096059.98315: sending task result for task 0afff68d-5257-086b-f4f0-000000001d1d 7530 1727096059.98396: done sending task result for task 0afff68d-5257-086b-f4f0-000000001d1d 7530 1727096059.98399: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7530 1727096059.98447: no more pending results, returning what we have 7530 1727096059.98451: results queue empty 7530 1727096059.98452: checking for any_errors_fatal 7530 1727096059.98458: done checking for any_errors_fatal 7530 1727096059.98458: checking for max_fail_percentage 7530 1727096059.98460: done checking for max_fail_percentage 7530 1727096059.98461: checking to see if all hosts have failed and the running result is not ok 7530 1727096059.98462: done checking to see if all hosts have failed 7530 1727096059.98463: getting the remaining hosts for this loop 7530 1727096059.98464: done getting the remaining hosts for this loop 7530 1727096059.98469: getting the next task for host managed_node3 7530 1727096059.98477: done getting next task for host managed_node3 7530 1727096059.98480: ^ task is: TASK: Install 
iproute 7530 1727096059.98483: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096059.98487: getting variables 7530 1727096059.98489: in VariableManager get_vars() 7530 1727096059.98539: Calling all_inventory to load vars for managed_node3 7530 1727096059.98542: Calling groups_inventory to load vars for managed_node3 7530 1727096059.98544: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096059.98555: Calling all_plugins_play to load vars for managed_node3 7530 1727096059.98558: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096059.98560: Calling groups_plugins_play to load vars for managed_node3 7530 1727096059.99354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096060.00216: done with get_vars() 7530 1727096060.00234: done getting variables 7530 1727096060.00280: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Monday 23 September 2024 08:54:20 -0400 
(0:00:00.035) 0:00:50.791 ****** 7530 1727096060.00303: entering _queue_task() for managed_node3/package 7530 1727096060.00552: worker is 1 (out of 1 available) 7530 1727096060.00566: exiting _queue_task() for managed_node3/package 7530 1727096060.00580: done queuing things up, now waiting for results queue to drain 7530 1727096060.00582: waiting for pending results... 7530 1727096060.00757: running TaskExecutor() for managed_node3/TASK: Install iproute 7530 1727096060.00828: in run() - task 0afff68d-5257-086b-f4f0-000000001a72 7530 1727096060.00842: variable 'ansible_search_path' from source: unknown 7530 1727096060.00846: variable 'ansible_search_path' from source: unknown 7530 1727096060.00876: calling self._execute() 7530 1727096060.00963: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.00969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.00981: variable 'omit' from source: magic vars 7530 1727096060.01265: variable 'ansible_distribution_major_version' from source: facts 7530 1727096060.01278: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096060.01285: variable 'omit' from source: magic vars 7530 1727096060.01309: variable 'omit' from source: magic vars 7530 1727096060.01451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7530 1727096060.02989: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7530 1727096060.03037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7530 1727096060.03069: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7530 1727096060.03094: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7530 1727096060.03118: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7530 1727096060.03192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7530 1727096060.03225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7530 1727096060.03245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7530 1727096060.03271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7530 1727096060.03283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7530 1727096060.03364: variable '__network_is_ostree' from source: set_fact 7530 1727096060.03369: variable 'omit' from source: magic vars 7530 1727096060.03394: variable 'omit' from source: magic vars 7530 1727096060.03419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096060.03446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096060.03461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096060.03480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096060.03489: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096060.03512: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096060.03515: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.03517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.03591: Set connection var ansible_pipelining to False 7530 1727096060.03595: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096060.03601: Set connection var ansible_timeout to 10 7530 1727096060.03609: Set connection var ansible_shell_executable to /bin/sh 7530 1727096060.03611: Set connection var ansible_shell_type to sh 7530 1727096060.03614: Set connection var ansible_connection to ssh 7530 1727096060.03634: variable 'ansible_shell_executable' from source: unknown 7530 1727096060.03637: variable 'ansible_connection' from source: unknown 7530 1727096060.03644: variable 'ansible_module_compression' from source: unknown 7530 1727096060.03646: variable 'ansible_shell_type' from source: unknown 7530 1727096060.03649: variable 'ansible_shell_executable' from source: unknown 7530 1727096060.03651: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.03653: variable 'ansible_pipelining' from source: unknown 7530 1727096060.03655: variable 'ansible_timeout' from source: unknown 7530 1727096060.03663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.03732: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096060.03744: variable 'omit' from source: magic vars 7530 1727096060.03750: starting attempt loop 7530 
1727096060.03752: running the handler 7530 1727096060.03759: variable 'ansible_facts' from source: unknown 7530 1727096060.03762: variable 'ansible_facts' from source: unknown 7530 1727096060.03795: _low_level_execute_command(): starting 7530 1727096060.03800: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096060.04313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096060.04318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.04321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.04323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.04377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.04380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.04383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.04430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.06131: stdout chunk (state=3): >>>/root <<< 7530 1727096060.06219: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 7530 1727096060.06255: stderr chunk (state=3): >>><<< 7530 1727096060.06259: stdout chunk (state=3): >>><<< 7530 1727096060.06284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.06296: _low_level_execute_command(): starting 7530 1727096060.06302: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787 `" && echo ansible-tmp-1727096060.0628352-9382-229494538993787="` echo /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787 `" ) && sleep 0' 7530 1727096060.06766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
7530 1727096060.06772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096060.06774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096060.06780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.06783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.06855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.06857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.06892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.08869: stdout chunk (state=3): >>>ansible-tmp-1727096060.0628352-9382-229494538993787=/root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787 <<< 7530 1727096060.08960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.08993: stderr chunk (state=3): >>><<< 7530 1727096060.08996: stdout chunk (state=3): >>><<< 7530 1727096060.09015: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096060.0628352-9382-229494538993787=/root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.09052: variable 'ansible_module_compression' from source: unknown 7530 1727096060.09103: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7530 1727096060.09138: variable 'ansible_facts' from source: unknown 7530 1727096060.09225: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py 7530 1727096060.09341: Sending initial data 7530 1727096060.09345: Sent initial data (150 bytes) 7530 1727096060.09806: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.09810: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.09813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096060.09815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.09876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.09879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.09889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.09924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.11641: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096060.11675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096060.11721: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpai6toob8 /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py <<< 7530 1727096060.11724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py" <<< 7530 1727096060.11773: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpai6toob8" to remote "/root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py" <<< 7530 1727096060.12781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.12835: stderr chunk (state=3): >>><<< 7530 1727096060.12963: stdout chunk (state=3): >>><<< 7530 1727096060.12966: done transferring module to remote 7530 1727096060.12970: _low_level_execute_command(): starting 7530 1727096060.12973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/ /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py && sleep 0' 7530 1727096060.13874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096060.13894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096060.13920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.13940: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096060.14034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.14053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.14071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.14092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.14265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.16178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.16231: stderr chunk (state=3): >>><<< 7530 1727096060.16244: stdout chunk (state=3): >>><<< 7530 1727096060.16265: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.16276: _low_level_execute_command(): starting 7530 1727096060.16362: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/AnsiballZ_dnf.py && sleep 0' 7530 1727096060.17408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096060.17641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.17838: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.17979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.18133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.18179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.61489: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7530 1727096060.66444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.66461: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096060.66585: stderr chunk (state=3): >>><<< 7530 1727096060.66596: stdout chunk (state=3): >>><<< 7530 1727096060.66625: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096060.66877: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096060.66888: _low_level_execute_command(): starting 7530 1727096060.66891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096060.0628352-9382-229494538993787/ > /dev/null 2>&1 && sleep 0' 7530 1727096060.67956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096060.68014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096060.68032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.68092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096060.68105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096060.68185: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.68221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.68237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.68257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.68318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.70386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.70391: stdout chunk (state=3): >>><<< 7530 1727096060.70394: stderr chunk (state=3): >>><<< 7530 1727096060.70427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.70479: handler run complete 7530 1727096060.70630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7530 1727096060.70826: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7530 1727096060.70875: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7530 1727096060.70913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7530 1727096060.71574: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7530 1727096060.71577: variable '__install_status' from source: set_fact 7530 1727096060.71580: Evaluated conditional (__install_status is success): True 7530 1727096060.71581: attempt loop complete, returning result 7530 1727096060.71583: _execute() done 7530 1727096060.71585: dumping result to json 7530 1727096060.71587: done dumping result, returning 7530 1727096060.71588: done running TaskExecutor() for managed_node3/TASK: Install iproute [0afff68d-5257-086b-f4f0-000000001a72] 7530 1727096060.71590: sending task result for task 0afff68d-5257-086b-f4f0-000000001a72 7530 1727096060.71672: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a72 7530 1727096060.71675: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7530 1727096060.71772: no more pending results, returning what we have 7530 1727096060.71775: results queue empty 7530 1727096060.71776: checking for 
any_errors_fatal 7530 1727096060.71786: done checking for any_errors_fatal 7530 1727096060.71787: checking for max_fail_percentage 7530 1727096060.71789: done checking for max_fail_percentage 7530 1727096060.71790: checking to see if all hosts have failed and the running result is not ok 7530 1727096060.71791: done checking to see if all hosts have failed 7530 1727096060.71791: getting the remaining hosts for this loop 7530 1727096060.71793: done getting the remaining hosts for this loop 7530 1727096060.71796: getting the next task for host managed_node3 7530 1727096060.71803: done getting next task for host managed_node3 7530 1727096060.71805: ^ task is: TASK: Create veth interface {{ interface }} 7530 1727096060.71808: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096060.71813: getting variables 7530 1727096060.71815: in VariableManager get_vars() 7530 1727096060.71865: Calling all_inventory to load vars for managed_node3 7530 1727096060.71870: Calling groups_inventory to load vars for managed_node3 7530 1727096060.71872: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096060.71885: Calling all_plugins_play to load vars for managed_node3 7530 1727096060.71891: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096060.71895: Calling groups_plugins_play to load vars for managed_node3 7530 1727096060.73421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096060.74712: done with get_vars() 7530 1727096060.74743: done getting variables 7530 1727096060.74804: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096060.74926: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Monday 23 September 2024 08:54:20 -0400 (0:00:00.746) 0:00:51.537 ****** 7530 1727096060.74960: entering _queue_task() for managed_node3/command 7530 1727096060.75327: worker is 1 (out of 1 available) 7530 1727096060.75342: exiting _queue_task() for managed_node3/command 7530 1727096060.75355: done queuing things up, now waiting for results queue to drain 7530 1727096060.75357: waiting for pending results... 
7530 1727096060.75685: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7530 1727096060.75800: in run() - task 0afff68d-5257-086b-f4f0-000000001a73 7530 1727096060.75828: variable 'ansible_search_path' from source: unknown 7530 1727096060.75836: variable 'ansible_search_path' from source: unknown 7530 1727096060.76132: variable 'interface' from source: play vars 7530 1727096060.76233: variable 'interface' from source: play vars 7530 1727096060.76317: variable 'interface' from source: play vars 7530 1727096060.76512: Loaded config def from plugin (lookup/items) 7530 1727096060.76527: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7530 1727096060.76569: variable 'omit' from source: magic vars 7530 1727096060.76723: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.76774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.76781: variable 'omit' from source: magic vars 7530 1727096060.77018: variable 'ansible_distribution_major_version' from source: facts 7530 1727096060.77045: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096060.77297: variable 'type' from source: play vars 7530 1727096060.77301: variable 'state' from source: include params 7530 1727096060.77304: variable 'interface' from source: play vars 7530 1727096060.77306: variable 'current_interfaces' from source: set_fact 7530 1727096060.77313: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7530 1727096060.77316: when evaluation is False, skipping this task 7530 1727096060.77345: variable 'item' from source: unknown 7530 1727096060.77407: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7530 1727096060.77579: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.77582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.77585: variable 'omit' from source: magic vars 7530 1727096060.77631: variable 'ansible_distribution_major_version' from source: facts 7530 1727096060.77635: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096060.77759: variable 'type' from source: play vars 7530 1727096060.77762: variable 'state' from source: include params 7530 1727096060.77765: variable 'interface' from source: play vars 7530 1727096060.77769: variable 'current_interfaces' from source: set_fact 7530 1727096060.77777: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7530 1727096060.77780: when evaluation is False, skipping this task 7530 1727096060.77799: variable 'item' from source: unknown 7530 1727096060.77845: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7530 1727096060.77924: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.77927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.77930: variable 'omit' from source: magic vars 7530 1727096060.78273: variable 'ansible_distribution_major_version' from source: facts 7530 1727096060.78276: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096060.78279: variable 'type' from source: play vars 7530 
1727096060.78281: variable 'state' from source: include params 7530 1727096060.78283: variable 'interface' from source: play vars 7530 1727096060.78284: variable 'current_interfaces' from source: set_fact 7530 1727096060.78288: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7530 1727096060.78289: when evaluation is False, skipping this task 7530 1727096060.78291: variable 'item' from source: unknown 7530 1727096060.78334: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7530 1727096060.78432: dumping result to json 7530 1727096060.78442: done dumping result, returning 7530 1727096060.78453: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0afff68d-5257-086b-f4f0-000000001a73] 7530 1727096060.78462: sending task result for task 0afff68d-5257-086b-f4f0-000000001a73 7530 1727096060.78534: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a73 skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7530 1727096060.78688: no more pending results, returning what we have 7530 1727096060.78692: results queue empty 7530 1727096060.78693: checking for any_errors_fatal 7530 1727096060.78703: done checking for any_errors_fatal 7530 1727096060.78704: checking for max_fail_percentage 7530 1727096060.78705: done checking for max_fail_percentage 7530 1727096060.78706: checking to see if all hosts have failed and the running result is not ok 7530 1727096060.78707: done checking to see if all hosts have failed 7530 1727096060.78708: getting the remaining hosts for this loop 7530 1727096060.78710: done getting the remaining hosts for this loop 7530 1727096060.78713: getting the next task 
for host managed_node3 7530 1727096060.78718: done getting next task for host managed_node3 7530 1727096060.78720: ^ task is: TASK: Set up veth as managed by NetworkManager 7530 1727096060.78723: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096060.78727: getting variables 7530 1727096060.78729: in VariableManager get_vars() 7530 1727096060.78781: Calling all_inventory to load vars for managed_node3 7530 1727096060.78784: Calling groups_inventory to load vars for managed_node3 7530 1727096060.78786: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096060.78798: Calling all_plugins_play to load vars for managed_node3 7530 1727096060.78801: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096060.78804: Calling groups_plugins_play to load vars for managed_node3 7530 1727096060.79325: WORKER PROCESS EXITING 7530 1727096060.80292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096060.81635: done with get_vars() 7530 1727096060.81664: done getting variables 7530 1727096060.81715: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Monday 23 September 2024 08:54:20 -0400 (0:00:00.067) 0:00:51.605 ****** 7530 1727096060.81746: entering _queue_task() for managed_node3/command 7530 1727096060.82016: worker is 1 (out of 1 available) 7530 1727096060.82031: exiting _queue_task() for managed_node3/command 7530 1727096060.82045: done queuing things up, now waiting for results queue to drain 7530 1727096060.82047: waiting for pending results... 7530 1727096060.82232: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7530 1727096060.82316: in run() - task 0afff68d-5257-086b-f4f0-000000001a74 7530 1727096060.82329: variable 'ansible_search_path' from source: unknown 7530 1727096060.82338: variable 'ansible_search_path' from source: unknown 7530 1727096060.82363: calling self._execute() 7530 1727096060.82451: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.82456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.82466: variable 'omit' from source: magic vars 7530 1727096060.82747: variable 'ansible_distribution_major_version' from source: facts 7530 1727096060.82759: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096060.82864: variable 'type' from source: play vars 7530 1727096060.82870: variable 'state' from source: include params 7530 1727096060.82873: Evaluated conditional (type == 'veth' and state == 'present'): False 7530 1727096060.82876: when evaluation is False, skipping this task 7530 1727096060.82880: _execute() done 7530 1727096060.82882: dumping result to json 7530 1727096060.82885: done dumping result, returning 7530 1727096060.82892: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0afff68d-5257-086b-f4f0-000000001a74] 7530 
1727096060.82896: sending task result for task 0afff68d-5257-086b-f4f0-000000001a74 7530 1727096060.82987: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a74 7530 1727096060.82990: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7530 1727096060.83049: no more pending results, returning what we have 7530 1727096060.83053: results queue empty 7530 1727096060.83054: checking for any_errors_fatal 7530 1727096060.83069: done checking for any_errors_fatal 7530 1727096060.83069: checking for max_fail_percentage 7530 1727096060.83071: done checking for max_fail_percentage 7530 1727096060.83072: checking to see if all hosts have failed and the running result is not ok 7530 1727096060.83073: done checking to see if all hosts have failed 7530 1727096060.83074: getting the remaining hosts for this loop 7530 1727096060.83076: done getting the remaining hosts for this loop 7530 1727096060.83080: getting the next task for host managed_node3 7530 1727096060.83086: done getting next task for host managed_node3 7530 1727096060.83088: ^ task is: TASK: Delete veth interface {{ interface }} 7530 1727096060.83091: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096060.83095: getting variables 7530 1727096060.83097: in VariableManager get_vars() 7530 1727096060.83150: Calling all_inventory to load vars for managed_node3 7530 1727096060.83153: Calling groups_inventory to load vars for managed_node3 7530 1727096060.83155: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096060.83205: Calling all_plugins_play to load vars for managed_node3 7530 1727096060.83210: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096060.83214: Calling groups_plugins_play to load vars for managed_node3 7530 1727096060.84740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096060.85620: done with get_vars() 7530 1727096060.85650: done getting variables 7530 1727096060.85703: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096060.85795: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Monday 23 September 2024 08:54:20 -0400 (0:00:00.040) 0:00:51.646 ****** 7530 1727096060.85822: entering _queue_task() for managed_node3/command 7530 1727096060.86102: worker is 1 (out of 1 available) 7530 1727096060.86115: exiting _queue_task() for managed_node3/command 7530 1727096060.86130: done queuing things up, now waiting for results queue to drain 7530 1727096060.86132: waiting for pending results... 
7530 1727096060.86323: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7530 1727096060.86408: in run() - task 0afff68d-5257-086b-f4f0-000000001a75 7530 1727096060.86421: variable 'ansible_search_path' from source: unknown 7530 1727096060.86425: variable 'ansible_search_path' from source: unknown 7530 1727096060.86456: calling self._execute() 7530 1727096060.86546: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.86550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.86560: variable 'omit' from source: magic vars 7530 1727096060.86849: variable 'ansible_distribution_major_version' from source: facts 7530 1727096060.86859: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096060.87006: variable 'type' from source: play vars 7530 1727096060.87010: variable 'state' from source: include params 7530 1727096060.87014: variable 'interface' from source: play vars 7530 1727096060.87017: variable 'current_interfaces' from source: set_fact 7530 1727096060.87027: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7530 1727096060.87030: variable 'omit' from source: magic vars 7530 1727096060.87058: variable 'omit' from source: magic vars 7530 1727096060.87134: variable 'interface' from source: play vars 7530 1727096060.87146: variable 'omit' from source: magic vars 7530 1727096060.87184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096060.87211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096060.87227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096060.87242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 
1727096060.87259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096060.87285: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096060.87289: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.87291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.87369: Set connection var ansible_pipelining to False 7530 1727096060.87376: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096060.87382: Set connection var ansible_timeout to 10 7530 1727096060.87390: Set connection var ansible_shell_executable to /bin/sh 7530 1727096060.87392: Set connection var ansible_shell_type to sh 7530 1727096060.87394: Set connection var ansible_connection to ssh 7530 1727096060.87415: variable 'ansible_shell_executable' from source: unknown 7530 1727096060.87418: variable 'ansible_connection' from source: unknown 7530 1727096060.87420: variable 'ansible_module_compression' from source: unknown 7530 1727096060.87422: variable 'ansible_shell_type' from source: unknown 7530 1727096060.87425: variable 'ansible_shell_executable' from source: unknown 7530 1727096060.87427: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096060.87431: variable 'ansible_pipelining' from source: unknown 7530 1727096060.87434: variable 'ansible_timeout' from source: unknown 7530 1727096060.87438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096060.87550: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096060.87559: variable 'omit' from source: magic vars 7530 
1727096060.87565: starting attempt loop 7530 1727096060.87569: running the handler 7530 1727096060.87585: _low_level_execute_command(): starting 7530 1727096060.87592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096060.88129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.88135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096060.88141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.88174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.88190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.88242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.89980: stdout chunk (state=3): >>>/root <<< 7530 1727096060.90073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.90108: stderr chunk (state=3): >>><<< 7530 1727096060.90111: stdout chunk (state=3): >>><<< 7530 1727096060.90133: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.90147: _low_level_execute_command(): starting 7530 1727096060.90153: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623 `" && echo ansible-tmp-1727096060.9013405-9422-110573929472623="` echo /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623 `" ) && sleep 0' 7530 1727096060.90640: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.90653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.90656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.90658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.90701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.90704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.90706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.90757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.92785: stdout chunk (state=3): >>>ansible-tmp-1727096060.9013405-9422-110573929472623=/root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623 <<< 7530 1727096060.92882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.92916: stderr chunk (state=3): >>><<< 7530 1727096060.92919: stdout chunk (state=3): >>><<< 7530 1727096060.92936: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096060.9013405-9422-110573929472623=/root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.92969: variable 'ansible_module_compression' from source: unknown 7530 1727096060.93016: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096060.93048: variable 'ansible_facts' from source: unknown 7530 1727096060.93105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py 7530 1727096060.93214: Sending initial data 7530 1727096060.93218: Sent initial data (154 bytes) 7530 1727096060.93672: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.93702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.93705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.93707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.93763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.93766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.93773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.93819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.95507: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096060.95530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 7530 1727096060.95564: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpgc0ng0mc /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py <<< 7530 1727096060.95572: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py" <<< 7530 1727096060.95597: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpgc0ng0mc" to remote "/root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py" <<< 7530 1727096060.95600: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py" <<< 7530 1727096060.96120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.96180: stderr chunk (state=3): >>><<< 7530 1727096060.96205: stdout chunk (state=3): >>><<< 7530 1727096060.96240: done transferring module to remote 7530 1727096060.96248: _low_level_execute_command(): starting 7530 1727096060.96253: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/ /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py && sleep 0' 7530 1727096060.96732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.96736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 
1727096060.96746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.96748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.96794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.96797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.96799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.96846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096060.98715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096060.98741: stderr chunk (state=3): >>><<< 7530 1727096060.98744: stdout chunk (state=3): >>><<< 7530 1727096060.98761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096060.98764: _low_level_execute_command(): starting 7530 1727096060.98771: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/AnsiballZ_command.py && sleep 0' 7530 1727096060.99252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.99255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.99258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096060.99260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096060.99308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 7530 1727096060.99311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096060.99314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096060.99365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.16428: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-23 08:54:21.151891", "end": "2024-09-23 08:54:21.160821", "delta": "0:00:00.008930", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 7530 1727096061.16499: stdout chunk (state=3): >>> <<< 7530 1727096061.19215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096061.19247: stderr chunk (state=3): >>><<< 7530 1727096061.19250: stdout chunk (state=3): >>><<< 7530 1727096061.19269: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-23 08:54:21.151891", "end": "2024-09-23 08:54:21.160821", "delta": "0:00:00.008930", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
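(Editorial aside, not part of the log: the module's JSON result shown on stdout above can be inspected programmatically. A minimal sketch follows; the JSON literal is copied, abridged, from the log entry above, and the parsing code is purely illustrative — it is not how Ansible itself consumes the result.)

```python
import json

# Module result as emitted on stdout in the log above
# (abridged to the fields used here; values copied verbatim).
raw = ('{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
       '"cmd": ["ip", "link", "del", "veth0", "type", "veth"], '
       '"start": "2024-09-23 08:54:21.151891", '
       '"end": "2024-09-23 08:54:21.160821", '
       '"delta": "0:00:00.008930", "msg": ""}')

result = json.loads(raw)

# The command succeeded and reported a change on the target host.
assert result["rc"] == 0 and result["changed"]
assert " ".join(result["cmd"]) == "ip link del veth0 type veth"
```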
7530 1727096061.19299: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096061.19308: _low_level_execute_command(): starting 7530 1727096061.19313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096060.9013405-9422-110573929472623/ > /dev/null 2>&1 && sleep 0' 7530 1727096061.19782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.19791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096061.19793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.19795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096061.19851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096061.19859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096061.19861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096061.19895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.21960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096061.21965: stderr chunk (state=3): >>><<< 7530 1727096061.21972: stdout chunk (state=3): >>><<< 7530 1727096061.22177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096061.22181: handler run complete 7530 1727096061.22183: Evaluated conditional (False): False 7530 1727096061.22185: attempt loop complete, returning result 7530 1727096061.22186: _execute() done 7530 1727096061.22188: dumping result to json 7530 1727096061.22190: done dumping result, returning 7530 1727096061.22192: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0afff68d-5257-086b-f4f0-000000001a75] 7530 1727096061.22194: sending task result for task 0afff68d-5257-086b-f4f0-000000001a75 ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.008930", "end": "2024-09-23 08:54:21.160821", "rc": 0, "start": "2024-09-23 08:54:21.151891" } 7530 1727096061.22417: no more pending results, returning what we have 7530 1727096061.22420: results queue empty 7530 1727096061.22421: checking for any_errors_fatal 7530 1727096061.22433: done checking for any_errors_fatal 7530 1727096061.22433: checking for max_fail_percentage 7530 1727096061.22435: done checking for max_fail_percentage 7530 1727096061.22439: checking to see if all hosts have failed and the running result is not ok 7530 1727096061.22441: done checking to see if all hosts have failed 7530 1727096061.22442: getting the remaining hosts for this loop 7530 1727096061.22444: done getting the remaining hosts for this loop 7530 1727096061.22448: getting the next task for host managed_node3 7530 1727096061.22457: done getting next task for host managed_node3 7530 1727096061.22459: ^ task is: TASK: Create dummy interface {{ interface }} 7530 1727096061.22463: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096061.22471: getting variables 7530 1727096061.22473: in VariableManager get_vars() 7530 1727096061.22545: Calling all_inventory to load vars for managed_node3 7530 1727096061.22549: Calling groups_inventory to load vars for managed_node3 7530 1727096061.22552: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.22571: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.22575: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.22579: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.23283: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a75 7530 1727096061.23287: WORKER PROCESS EXITING 7530 1727096061.23514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096061.24528: done with get_vars() 7530 1727096061.24567: done getting variables 7530 1727096061.24644: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096061.24771: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Monday 23 September 2024 08:54:21 -0400 (0:00:00.389) 
0:00:52.036 ****** 7530 1727096061.24803: entering _queue_task() for managed_node3/command 7530 1727096061.25157: worker is 1 (out of 1 available) 7530 1727096061.25170: exiting _queue_task() for managed_node3/command 7530 1727096061.25185: done queuing things up, now waiting for results queue to drain 7530 1727096061.25187: waiting for pending results... 7530 1727096061.25592: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7530 1727096061.25631: in run() - task 0afff68d-5257-086b-f4f0-000000001a76 7530 1727096061.25657: variable 'ansible_search_path' from source: unknown 7530 1727096061.25664: variable 'ansible_search_path' from source: unknown 7530 1727096061.25712: calling self._execute() 7530 1727096061.25845: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.25857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.25875: variable 'omit' from source: magic vars 7530 1727096061.26291: variable 'ansible_distribution_major_version' from source: facts 7530 1727096061.26302: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096061.26453: variable 'type' from source: play vars 7530 1727096061.26458: variable 'state' from source: include params 7530 1727096061.26461: variable 'interface' from source: play vars 7530 1727096061.26463: variable 'current_interfaces' from source: set_fact 7530 1727096061.26466: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7530 1727096061.26470: when evaluation is False, skipping this task 7530 1727096061.26476: _execute() done 7530 1727096061.26479: dumping result to json 7530 1727096061.26482: done dumping result, returning 7530 1727096061.26488: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0afff68d-5257-086b-f4f0-000000001a76] 7530 1727096061.26494: sending task result for task 
0afff68d-5257-086b-f4f0-000000001a76 7530 1727096061.26585: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a76 7530 1727096061.26588: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096061.26639: no more pending results, returning what we have 7530 1727096061.26643: results queue empty 7530 1727096061.26645: checking for any_errors_fatal 7530 1727096061.26652: done checking for any_errors_fatal 7530 1727096061.26653: checking for max_fail_percentage 7530 1727096061.26655: done checking for max_fail_percentage 7530 1727096061.26656: checking to see if all hosts have failed and the running result is not ok 7530 1727096061.26657: done checking to see if all hosts have failed 7530 1727096061.26657: getting the remaining hosts for this loop 7530 1727096061.26659: done getting the remaining hosts for this loop 7530 1727096061.26662: getting the next task for host managed_node3 7530 1727096061.26677: done getting next task for host managed_node3 7530 1727096061.26680: ^ task is: TASK: Delete dummy interface {{ interface }} 7530 1727096061.26684: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096061.26689: getting variables 7530 1727096061.26691: in VariableManager get_vars() 7530 1727096061.26744: Calling all_inventory to load vars for managed_node3 7530 1727096061.26747: Calling groups_inventory to load vars for managed_node3 7530 1727096061.26750: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.26762: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.26765: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.26774: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.27690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096061.28933: done with get_vars() 7530 1727096061.28996: done getting variables 7530 1727096061.29091: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096061.29248: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Monday 23 September 2024 08:54:21 -0400 (0:00:00.044) 0:00:52.081 ****** 7530 1727096061.29291: entering _queue_task() for managed_node3/command 7530 1727096061.29577: worker is 1 (out of 1 available) 7530 1727096061.29591: exiting _queue_task() for managed_node3/command 7530 1727096061.29605: done queuing things up, now waiting for results queue to drain 7530 1727096061.29607: waiting for pending results... 
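(Editorial aside, not part of the log: the create/delete tasks above are gated by conditionals of the form `type == '…' and state == '…' and interface (not) in current_interfaces`, which is why the veth deletion ran while the dummy tasks were skipped. A hypothetical sketch of that gating logic follows; the variable names mirror the play vars shown in the log, but this is not the actual role source.)

```python
# Hypothetical reconstruction of the conditional gating seen in the log.
# 'type_', 'state', 'interface', and 'current_interfaces' mirror the
# play vars above; this is illustrative, not the role's implementation.
def should_run(task_type: str, task_state: str,
               type_: str, state: str,
               interface: str, current_interfaces: list) -> bool:
    if type_ != task_type or state != task_state:
        return False
    if task_state == "present":
        # creation runs only when the interface does not exist yet
        return interface not in current_interfaces
    # deletion runs only when the interface currently exists
    return interface in current_interfaces

# Values from the log: type='veth', state='absent', veth0 present.
assert should_run("veth", "absent", "veth", "absent",
                  "veth0", ["veth0", "eth0"])          # deletion ran
assert not should_run("dummy", "present", "veth", "absent",
                      "veth0", ["veth0", "eth0"])      # dummy tasks skipped
```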
7530 1727096061.29790: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7530 1727096061.29875: in run() - task 0afff68d-5257-086b-f4f0-000000001a77 7530 1727096061.29889: variable 'ansible_search_path' from source: unknown 7530 1727096061.29893: variable 'ansible_search_path' from source: unknown 7530 1727096061.29922: calling self._execute() 7530 1727096061.30014: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.30019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.30028: variable 'omit' from source: magic vars 7530 1727096061.30318: variable 'ansible_distribution_major_version' from source: facts 7530 1727096061.30328: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096061.30471: variable 'type' from source: play vars 7530 1727096061.30475: variable 'state' from source: include params 7530 1727096061.30485: variable 'interface' from source: play vars 7530 1727096061.30488: variable 'current_interfaces' from source: set_fact 7530 1727096061.30492: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7530 1727096061.30494: when evaluation is False, skipping this task 7530 1727096061.30497: _execute() done 7530 1727096061.30499: dumping result to json 7530 1727096061.30502: done dumping result, returning 7530 1727096061.30510: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0afff68d-5257-086b-f4f0-000000001a77] 7530 1727096061.30513: sending task result for task 0afff68d-5257-086b-f4f0-000000001a77 7530 1727096061.30599: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a77 7530 1727096061.30601: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 
1727096061.30659: no more pending results, returning what we have 7530 1727096061.30662: results queue empty 7530 1727096061.30663: checking for any_errors_fatal 7530 1727096061.30672: done checking for any_errors_fatal 7530 1727096061.30673: checking for max_fail_percentage 7530 1727096061.30675: done checking for max_fail_percentage 7530 1727096061.30676: checking to see if all hosts have failed and the running result is not ok 7530 1727096061.30677: done checking to see if all hosts have failed 7530 1727096061.30678: getting the remaining hosts for this loop 7530 1727096061.30679: done getting the remaining hosts for this loop 7530 1727096061.30683: getting the next task for host managed_node3 7530 1727096061.30689: done getting next task for host managed_node3 7530 1727096061.30692: ^ task is: TASK: Create tap interface {{ interface }} 7530 1727096061.30696: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096061.30700: getting variables 7530 1727096061.30702: in VariableManager get_vars() 7530 1727096061.30755: Calling all_inventory to load vars for managed_node3 7530 1727096061.30758: Calling groups_inventory to load vars for managed_node3 7530 1727096061.30760: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.30779: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.30782: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.30785: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.31610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096061.37803: done with get_vars() 7530 1727096061.37829: done getting variables 7530 1727096061.37871: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096061.37941: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Monday 23 September 2024 08:54:21 -0400 (0:00:00.086) 0:00:52.167 ****** 7530 1727096061.37962: entering _queue_task() for managed_node3/command 7530 1727096061.38240: worker is 1 (out of 1 available) 7530 1727096061.38252: exiting _queue_task() for managed_node3/command 7530 1727096061.38269: done queuing things up, now waiting for results queue to drain 7530 1727096061.38272: waiting for pending results... 
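When its conditional does hold, a task like the skipped "Delete dummy interface veth0" above reduces to an iproute2 call. A hedged sketch of the create/delete pair (assumed shapes; the actual commands live in manage_test_interface.yml, which this log does not show). The commands are printed rather than executed, since link management needs CAP_NET_ADMIN:

```shell
#!/bin/sh
# Illustrative iproute2 commands for the dummy-interface tasks above.
# Assumed, not copied from the playbook; echoed because `ip link`
# changes require root privileges.
interface='veth0'
create_dummy="ip link add $interface type dummy"
delete_dummy="ip link del $interface type dummy"
echo "$create_dummy"
echo "$delete_dummy"
```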
7530 1727096061.38466: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7530 1727096061.38555: in run() - task 0afff68d-5257-086b-f4f0-000000001a78 7530 1727096061.38574: variable 'ansible_search_path' from source: unknown 7530 1727096061.38578: variable 'ansible_search_path' from source: unknown 7530 1727096061.38607: calling self._execute() 7530 1727096061.38700: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.38705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.38715: variable 'omit' from source: magic vars 7530 1727096061.39087: variable 'ansible_distribution_major_version' from source: facts 7530 1727096061.39091: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096061.39426: variable 'type' from source: play vars 7530 1727096061.39429: variable 'state' from source: include params 7530 1727096061.39431: variable 'interface' from source: play vars 7530 1727096061.39433: variable 'current_interfaces' from source: set_fact 7530 1727096061.39436: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7530 1727096061.39438: when evaluation is False, skipping this task 7530 1727096061.39441: _execute() done 7530 1727096061.39442: dumping result to json 7530 1727096061.39445: done dumping result, returning 7530 1727096061.39447: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0afff68d-5257-086b-f4f0-000000001a78] 7530 1727096061.39448: sending task result for task 0afff68d-5257-086b-f4f0-000000001a78 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096061.39587: no more pending results, returning what we have 7530 1727096061.39591: results queue empty 7530 1727096061.39592: checking for 
any_errors_fatal 7530 1727096061.39599: done checking for any_errors_fatal 7530 1727096061.39600: checking for max_fail_percentage 7530 1727096061.39602: done checking for max_fail_percentage 7530 1727096061.39603: checking to see if all hosts have failed and the running result is not ok 7530 1727096061.39605: done checking to see if all hosts have failed 7530 1727096061.39605: getting the remaining hosts for this loop 7530 1727096061.39607: done getting the remaining hosts for this loop 7530 1727096061.39611: getting the next task for host managed_node3 7530 1727096061.39629: done getting next task for host managed_node3 7530 1727096061.39633: ^ task is: TASK: Delete tap interface {{ interface }} 7530 1727096061.39636: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096061.39641: getting variables 7530 1727096061.39643: in VariableManager get_vars() 7530 1727096061.39700: Calling all_inventory to load vars for managed_node3 7530 1727096061.39703: Calling groups_inventory to load vars for managed_node3 7530 1727096061.39705: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.39712: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a78 7530 1727096061.39714: WORKER PROCESS EXITING 7530 1727096061.39733: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.39737: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.39740: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.40904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096061.41810: done with get_vars() 7530 1727096061.41836: done getting variables 7530 1727096061.41887: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7530 1727096061.41978: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Monday 23 September 2024 08:54:21 -0400 (0:00:00.040) 0:00:52.208 ****** 7530 1727096061.42004: entering _queue_task() for managed_node3/command 7530 1727096061.42291: worker is 1 (out of 1 available) 7530 1727096061.42305: exiting _queue_task() for managed_node3/command 7530 1727096061.42318: done queuing things up, now waiting for results queue to drain 7530 1727096061.42320: waiting for pending results... 
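The tap-interface tasks queued and skipped around here follow the same pattern with the tuntap subcommand. Again a sketch with assumed command shapes, printed rather than run because tuntap management also requires root:

```shell
#!/bin/sh
# Illustrative commands for the "Create/Delete tap interface" tasks
# above (hypothetical; not taken from manage_test_interface.yml).
interface='veth0'
create_tap="ip tuntap add dev $interface mode tap"
delete_tap="ip tuntap del dev $interface mode tap"
echo "$create_tap"
echo "$delete_tap"
```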
7530 1727096061.42560: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7530 1727096061.42686: in run() - task 0afff68d-5257-086b-f4f0-000000001a79 7530 1727096061.42704: variable 'ansible_search_path' from source: unknown 7530 1727096061.42707: variable 'ansible_search_path' from source: unknown 7530 1727096061.42746: calling self._execute() 7530 1727096061.42848: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.42852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.42866: variable 'omit' from source: magic vars 7530 1727096061.43312: variable 'ansible_distribution_major_version' from source: facts 7530 1727096061.43316: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096061.43532: variable 'type' from source: play vars 7530 1727096061.43549: variable 'state' from source: include params 7530 1727096061.43556: variable 'interface' from source: play vars 7530 1727096061.43775: variable 'current_interfaces' from source: set_fact 7530 1727096061.43779: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7530 1727096061.43783: when evaluation is False, skipping this task 7530 1727096061.43785: _execute() done 7530 1727096061.43788: dumping result to json 7530 1727096061.43790: done dumping result, returning 7530 1727096061.43792: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0afff68d-5257-086b-f4f0-000000001a79] 7530 1727096061.43794: sending task result for task 0afff68d-5257-086b-f4f0-000000001a79 7530 1727096061.43924: done sending task result for task 0afff68d-5257-086b-f4f0-000000001a79 7530 1727096061.43928: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7530 1727096061.44020: no 
more pending results, returning what we have 7530 1727096061.44024: results queue empty 7530 1727096061.44025: checking for any_errors_fatal 7530 1727096061.44035: done checking for any_errors_fatal 7530 1727096061.44035: checking for max_fail_percentage 7530 1727096061.44037: done checking for max_fail_percentage 7530 1727096061.44038: checking to see if all hosts have failed and the running result is not ok 7530 1727096061.44039: done checking to see if all hosts have failed 7530 1727096061.44039: getting the remaining hosts for this loop 7530 1727096061.44041: done getting the remaining hosts for this loop 7530 1727096061.44044: getting the next task for host managed_node3 7530 1727096061.44053: done getting next task for host managed_node3 7530 1727096061.44056: ^ task is: TASK: Verify network state restored to default 7530 1727096061.44058: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096061.44062: getting variables 7530 1727096061.44063: in VariableManager get_vars() 7530 1727096061.44113: Calling all_inventory to load vars for managed_node3 7530 1727096061.44116: Calling groups_inventory to load vars for managed_node3 7530 1727096061.44118: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.44128: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.44131: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.44133: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.45121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096061.46285: done with get_vars() 7530 1727096061.46327: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:149 Monday 23 September 2024 08:54:21 -0400 (0:00:00.044) 0:00:52.252 ****** 7530 1727096061.46430: entering _queue_task() for managed_node3/include_tasks 7530 1727096061.46794: worker is 1 (out of 1 available) 7530 1727096061.46805: exiting _queue_task() for managed_node3/include_tasks 7530 1727096061.46819: done queuing things up, now waiting for results queue to drain 7530 1727096061.46820: waiting for pending results... 
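The just-queued "Verify network state restored to default" task is an include that, as the log shows further on, pulls in check_network_dns.yml and runs a "Check routes and DNS" shell task. A hypothetical reconstruction of the kind of diagnostics such a task gathers (the real commands are in check_network_dns.yml, which this log does not reproduce); the sketch only prints the commands so it runs without network access or root:

```shell
#!/bin/sh
# Hypothetical diagnostics for a routes-and-DNS check; command list is
# an assumption, not the playbook's actual content.
for cmd in 'ip route' 'ip -6 route' 'cat /etc/resolv.conf'; do
    echo "would run: $cmd"
done
```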
7530 1727096061.47160: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 7530 1727096061.47259: in run() - task 0afff68d-5257-086b-f4f0-000000000151 7530 1727096061.47282: variable 'ansible_search_path' from source: unknown 7530 1727096061.47314: calling self._execute() 7530 1727096061.47420: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.47425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.47432: variable 'omit' from source: magic vars 7530 1727096061.47733: variable 'ansible_distribution_major_version' from source: facts 7530 1727096061.47747: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096061.47759: _execute() done 7530 1727096061.47762: dumping result to json 7530 1727096061.47765: done dumping result, returning 7530 1727096061.47773: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0afff68d-5257-086b-f4f0-000000000151] 7530 1727096061.47779: sending task result for task 0afff68d-5257-086b-f4f0-000000000151 7530 1727096061.47874: done sending task result for task 0afff68d-5257-086b-f4f0-000000000151 7530 1727096061.47877: WORKER PROCESS EXITING 7530 1727096061.47914: no more pending results, returning what we have 7530 1727096061.47919: in VariableManager get_vars() 7530 1727096061.47980: Calling all_inventory to load vars for managed_node3 7530 1727096061.47983: Calling groups_inventory to load vars for managed_node3 7530 1727096061.47987: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.48000: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.48003: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.48005: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.50484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 7530 1727096061.53004: done with get_vars() 7530 1727096061.53039: variable 'ansible_search_path' from source: unknown 7530 1727096061.53058: we have included files to process 7530 1727096061.53060: generating all_blocks data 7530 1727096061.53063: done generating all_blocks data 7530 1727096061.53175: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7530 1727096061.53177: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7530 1727096061.53181: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7530 1727096061.54113: done processing included file 7530 1727096061.54115: iterating over new_blocks loaded from include file 7530 1727096061.54117: in VariableManager get_vars() 7530 1727096061.54151: done with get_vars() 7530 1727096061.54153: filtering new block on tags 7530 1727096061.54225: done filtering new block on tags 7530 1727096061.54228: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 7530 1727096061.54235: extending task lists for all hosts with included blocks 7530 1727096061.61477: done extending task lists 7530 1727096061.61479: done processing included files 7530 1727096061.61480: results queue empty 7530 1727096061.61481: checking for any_errors_fatal 7530 1727096061.61484: done checking for any_errors_fatal 7530 1727096061.61485: checking for max_fail_percentage 7530 1727096061.61486: done checking for max_fail_percentage 7530 1727096061.61487: checking to see if all hosts have failed and the running result is not ok 7530 1727096061.61488: done checking to see if all hosts have failed 7530 1727096061.61489: getting the 
remaining hosts for this loop 7530 1727096061.61490: done getting the remaining hosts for this loop 7530 1727096061.61492: getting the next task for host managed_node3 7530 1727096061.61496: done getting next task for host managed_node3 7530 1727096061.61498: ^ task is: TASK: Check routes and DNS 7530 1727096061.61500: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096061.61503: getting variables 7530 1727096061.61504: in VariableManager get_vars() 7530 1727096061.61526: Calling all_inventory to load vars for managed_node3 7530 1727096061.61528: Calling groups_inventory to load vars for managed_node3 7530 1727096061.61530: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096061.61537: Calling all_plugins_play to load vars for managed_node3 7530 1727096061.61539: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096061.61541: Calling groups_plugins_play to load vars for managed_node3 7530 1727096061.62672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096061.64207: done with get_vars() 7530 1727096061.64240: done getting variables 7530 1727096061.64298: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 08:54:21 -0400 (0:00:00.178) 0:00:52.431 ****** 7530 1727096061.64330: entering _queue_task() for managed_node3/shell 7530 1727096061.64699: worker is 1 (out of 1 available) 7530 1727096061.64712: exiting _queue_task() for managed_node3/shell 7530 1727096061.64838: done queuing things up, now waiting for results queue to drain 7530 1727096061.64840: waiting for pending results... 7530 1727096061.65187: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 7530 1727096061.65193: in run() - task 0afff68d-5257-086b-f4f0-000000001d93 7530 1727096061.65196: variable 'ansible_search_path' from source: unknown 7530 1727096061.65199: variable 'ansible_search_path' from source: unknown 7530 1727096061.65202: calling self._execute() 7530 1727096061.65303: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.65308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.65320: variable 'omit' from source: magic vars 7530 1727096061.65720: variable 'ansible_distribution_major_version' from source: facts 7530 1727096061.65732: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096061.65742: variable 'omit' from source: magic vars 7530 1727096061.65783: variable 'omit' from source: magic vars 7530 1727096061.65827: variable 'omit' from source: magic vars 7530 1727096061.65872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096061.65912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096061.65941: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096061.65957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096061.65969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096061.66072: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096061.66076: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.66079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.66110: Set connection var ansible_pipelining to False 7530 1727096061.66116: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096061.66122: Set connection var ansible_timeout to 10 7530 1727096061.66132: Set connection var ansible_shell_executable to /bin/sh 7530 1727096061.66135: Set connection var ansible_shell_type to sh 7530 1727096061.66141: Set connection var ansible_connection to ssh 7530 1727096061.66171: variable 'ansible_shell_executable' from source: unknown 7530 1727096061.66175: variable 'ansible_connection' from source: unknown 7530 1727096061.66178: variable 'ansible_module_compression' from source: unknown 7530 1727096061.66180: variable 'ansible_shell_type' from source: unknown 7530 1727096061.66182: variable 'ansible_shell_executable' from source: unknown 7530 1727096061.66184: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096061.66187: variable 'ansible_pipelining' from source: unknown 7530 1727096061.66189: variable 'ansible_timeout' from source: unknown 7530 1727096061.66192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096061.66340: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096061.66534: variable 'omit' from source: magic vars 7530 1727096061.66540: starting attempt loop 7530 1727096061.66542: running the handler 7530 1727096061.66545: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096061.66547: _low_level_execute_command(): starting 7530 1727096061.66550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096061.67139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096061.67156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096061.67175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.67293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096061.67307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096061.67381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.69088: stdout chunk (state=3): >>>/root <<< 7530 1727096061.69183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096061.69248: stderr chunk (state=3): >>><<< 7530 1727096061.69262: stdout chunk (state=3): >>><<< 7530 1727096061.69292: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096061.69312: _low_level_execute_command(): starting 7530 1727096061.69322: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702 `" && echo ansible-tmp-1727096061.6929975-9467-101402458748702="` echo /root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702 `" ) && sleep 0' 7530 1727096061.69991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096061.70006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096061.70023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.70057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096061.70149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096061.70192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096061.70206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096061.70276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.72263: stdout chunk (state=3): >>>ansible-tmp-1727096061.6929975-9467-101402458748702=/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702 <<< 7530 
1727096061.72426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096061.72430: stdout chunk (state=3): >>><<< 7530 1727096061.72439: stderr chunk (state=3): >>><<< 7530 1727096061.72458: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096061.6929975-9467-101402458748702=/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096061.72522: variable 'ansible_module_compression' from source: unknown 7530 1727096061.72557: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096061.72605: variable 'ansible_facts' from source: unknown 7530 1727096061.72697: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py 7530 1727096061.72921: Sending initial data 7530 1727096061.72924: Sent initial data (154 bytes) 7530 1727096061.73573: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096061.73577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096061.73580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.73587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.73590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096061.73593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096061.73672: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096061.73690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096061.73754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.75414: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096061.75477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096061.75519: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmpn8exgf0e /root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py <<< 7530 1727096061.75523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py" <<< 7530 1727096061.75587: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmpn8exgf0e" to remote "/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py" <<< 7530 1727096061.76423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096061.76440: stderr chunk (state=3): >>><<< 7530 1727096061.76576: stdout chunk (state=3): >>><<< 7530 1727096061.76579: done transferring module to remote 7530 1727096061.76581: _low_level_execute_command(): starting 7530 1727096061.76584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/ /root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py && sleep 0' 7530 1727096061.77211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7530 1727096061.77215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096061.77217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.77219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096061.77222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 7530 1727096061.77228: stderr chunk (state=3): >>>debug2: match not found <<< 7530 1727096061.77230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096061.77282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096061.77743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096061.78053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.79769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096061.79774: stdout chunk (state=3): >>><<< 7530 
1727096061.79777: stderr chunk (state=3): >>><<< 7530 1727096061.79796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096061.79799: _low_level_execute_command(): starting 7530 1727096061.79805: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/AnsiballZ_command.py && sleep 0' 7530 1727096061.80924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.80929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096061.81112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found <<< 7530 1727096061.81117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096061.81119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096061.81288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096061.81324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096061.97954: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3418sec preferred_lft 3418sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel 
metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:54:21.967967", "end": "2024-09-23 08:54:21.976747", "delta": "0:00:00.008780", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096061.99576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096061.99612: stderr chunk (state=3): >>><<< 7530 1727096061.99615: stdout chunk (state=3): >>><<< 7530 1727096061.99641: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3418sec preferred_lft 3418sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:54:21.967967", "end": "2024-09-23 08:54:21.976747", "delta": "0:00:00.008780", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
7530 1727096061.99692: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096061.99742: _low_level_execute_command(): starting 7530 1727096061.99745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096061.6929975-9467-101402458748702/ > /dev/null 2>&1 && sleep 0' 7530 1727096062.00486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.00534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.00644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096062.00662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.00742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.02675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096062.02722: stdout chunk (state=3): >>><<< 7530 1727096062.02725: stderr chunk (state=3): >>><<< 7530 1727096062.02782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096062.02812: handler run complete 7530 1727096062.02986: Evaluated conditional (False): False 7530 1727096062.02990: attempt loop complete, returning result 7530 1727096062.02992: _execute() done 7530 1727096062.02995: dumping result to json 7530 1727096062.02997: done dumping result, returning 7530 1727096062.02999: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0afff68d-5257-086b-f4f0-000000001d93] 7530 1727096062.03001: sending task result for task 0afff68d-5257-086b-f4f0-000000001d93 7530 1727096062.03090: done sending task result for task 0afff68d-5257-086b-f4f0-000000001d93 7530 1727096062.03094: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008780", "end": "2024-09-23 08:54:21.976747", "rc": 0, "start": "2024-09-23 08:54:21.967967" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3418sec preferred_lft 3418sec inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 IP -6 
ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 7530 1727096062.03191: no more pending results, returning what we have 7530 1727096062.03196: results queue empty 7530 1727096062.03197: checking for any_errors_fatal 7530 1727096062.03199: done checking for any_errors_fatal 7530 1727096062.03199: checking for max_fail_percentage 7530 1727096062.03201: done checking for max_fail_percentage 7530 1727096062.03202: checking to see if all hosts have failed and the running result is not ok 7530 1727096062.03204: done checking to see if all hosts have failed 7530 1727096062.03204: getting the remaining hosts for this loop 7530 1727096062.03206: done getting the remaining hosts for this loop 7530 1727096062.03209: getting the next task for host managed_node3 7530 1727096062.03217: done getting next task for host managed_node3 7530 1727096062.03219: ^ task is: TASK: Verify DNS and network connectivity 7530 1727096062.03223: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096062.03227: getting variables 7530 1727096062.03229: in VariableManager get_vars() 7530 1727096062.03393: Calling all_inventory to load vars for managed_node3 7530 1727096062.03396: Calling groups_inventory to load vars for managed_node3 7530 1727096062.03399: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096062.03417: Calling all_plugins_play to load vars for managed_node3 7530 1727096062.03420: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096062.03424: Calling groups_plugins_play to load vars for managed_node3 7530 1727096062.05554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096062.06724: done with get_vars() 7530 1727096062.06750: done getting variables 7530 1727096062.06799: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 08:54:22 -0400 (0:00:00.424) 0:00:52.856 ****** 7530 1727096062.06822: entering _queue_task() for managed_node3/shell 7530 1727096062.07090: worker is 1 (out of 1 available) 7530 1727096062.07104: exiting _queue_task() for managed_node3/shell 7530 1727096062.07116: done queuing things up, now waiting for results queue to drain 7530 1727096062.07118: waiting for pending results... 
7530 1727096062.07311: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 7530 1727096062.07383: in run() - task 0afff68d-5257-086b-f4f0-000000001d94 7530 1727096062.07395: variable 'ansible_search_path' from source: unknown 7530 1727096062.07398: variable 'ansible_search_path' from source: unknown 7530 1727096062.07428: calling self._execute() 7530 1727096062.07512: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096062.07515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096062.07524: variable 'omit' from source: magic vars 7530 1727096062.07874: variable 'ansible_distribution_major_version' from source: facts 7530 1727096062.07877: Evaluated conditional (ansible_distribution_major_version != '6'): True 7530 1727096062.08021: variable 'ansible_facts' from source: unknown 7530 1727096062.08778: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 7530 1727096062.08874: variable 'omit' from source: magic vars 7530 1727096062.08877: variable 'omit' from source: magic vars 7530 1727096062.08880: variable 'omit' from source: magic vars 7530 1727096062.08916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7530 1727096062.08969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7530 1727096062.09000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7530 1727096062.09044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096062.09048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7530 1727096062.09062: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7530 1727096062.09069: variable 'ansible_host' from 
source: host vars for 'managed_node3' 7530 1727096062.09072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096062.09144: Set connection var ansible_pipelining to False 7530 1727096062.09153: Set connection var ansible_module_compression to ZIP_DEFLATED 7530 1727096062.09156: Set connection var ansible_timeout to 10 7530 1727096062.09165: Set connection var ansible_shell_executable to /bin/sh 7530 1727096062.09169: Set connection var ansible_shell_type to sh 7530 1727096062.09172: Set connection var ansible_connection to ssh 7530 1727096062.09192: variable 'ansible_shell_executable' from source: unknown 7530 1727096062.09195: variable 'ansible_connection' from source: unknown 7530 1727096062.09198: variable 'ansible_module_compression' from source: unknown 7530 1727096062.09200: variable 'ansible_shell_type' from source: unknown 7530 1727096062.09202: variable 'ansible_shell_executable' from source: unknown 7530 1727096062.09204: variable 'ansible_host' from source: host vars for 'managed_node3' 7530 1727096062.09207: variable 'ansible_pipelining' from source: unknown 7530 1727096062.09210: variable 'ansible_timeout' from source: unknown 7530 1727096062.09214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7530 1727096062.09318: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096062.09327: variable 'omit' from source: magic vars 7530 1727096062.09332: starting attempt loop 7530 1727096062.09334: running the handler 7530 1727096062.09345: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7530 1727096062.09360: _low_level_execute_command(): starting 7530 1727096062.09370: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7530 1727096062.09890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096062.09894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096062.09897: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 7530 1727096062.09900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.09942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096062.09946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096062.09948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.09997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.11682: stdout chunk (state=3): >>>/root <<< 7530 1727096062.11839: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096062.11843: stdout chunk (state=3): >>><<< 7530 1727096062.11845: stderr chunk (state=3): >>><<< 7530 1727096062.11873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096062.11897: _low_level_execute_command(): starting 7530 1727096062.11946: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083 `" && echo ansible-tmp-1727096062.1188223-9488-214606277403083="` echo /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083 `" ) && sleep 0' 7530 1727096062.12423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 7530 1727096062.12453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096062.12458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.12460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096062.12462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.12514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096062.12517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.12560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.14539: stdout chunk (state=3): >>>ansible-tmp-1727096062.1188223-9488-214606277403083=/root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083 <<< 7530 1727096062.14645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096062.14678: stderr chunk (state=3): >>><<< 7530 1727096062.14689: stdout chunk (state=3): >>><<< 7530 1727096062.14743: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096062.1188223-9488-214606277403083=/root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096062.14770: variable 'ansible_module_compression' from source: unknown 7530 1727096062.14847: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-75303i9bq39a/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7530 1727096062.14892: variable 'ansible_facts' from source: unknown 7530 1727096062.15063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py 7530 1727096062.15220: Sending initial data 7530 1727096062.15236: Sent initial data (154 bytes) 7530 1727096062.15665: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 7530 1727096062.15684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096062.15697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.15750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096062.15758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.15794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.17418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7530 1727096062.17513: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7530 1727096062.17517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py" <<< 7530 1727096062.17519: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-75303i9bq39a/tmp3iavb299 /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py <<< 7530 1727096062.17555: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-75303i9bq39a/tmp3iavb299" to remote "/root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py" <<< 7530 1727096062.18205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096062.18279: stderr chunk (state=3): >>><<< 7530 1727096062.18282: stdout chunk (state=3): >>><<< 7530 1727096062.18304: done transferring module to remote 7530 1727096062.18314: _low_level_execute_command(): starting 7530 1727096062.18318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/ /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py && sleep 0' 7530 1727096062.18738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096062.18766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.18776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096062.18778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7530 1727096062.18780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.18822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096062.18825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.18863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.20666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096062.20705: stderr chunk (state=3): >>><<< 7530 1727096062.20707: stdout chunk (state=3): >>><<< 7530 1727096062.20719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096062.20773: _low_level_execute_command(): starting 7530 1727096062.20777: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/AnsiballZ_command.py && sleep 0' 7530 1727096062.21184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096062.21187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096062.21189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7530 1727096062.21191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096062.21193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.21248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096062.21252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096062.21255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.21296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.69751: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- 
--:--:-- 0\r100 305 100 305 0 0 3475 0 --:--:-- --:--:-- --:--:-- 3505\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1380 0 --:--:-- --:--:-- --:--:-- 1385", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:54:22.366458", "end": "2024-09-23 08:54:22.694804", "delta": "0:00:00.328346", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7530 1727096062.71440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 7530 1727096062.71450: stderr chunk (state=3): >>><<< 7530 1727096062.71453: stdout chunk (state=3): >>><<< 7530 1727096062.71535: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3475 0 --:--:-- --:--:-- --:--:-- 3505\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1380 0 --:--:-- --:--:-- --:--:-- 1385", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:54:22.366458", "end": "2024-09-23 08:54:22.694804", "delta": "0:00:00.328346", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 7530 1727096062.71548: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7530 1727096062.71551: _low_level_execute_command(): starting 7530 1727096062.71553: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096062.1188223-9488-214606277403083/ > /dev/null 2>&1 && sleep 0' 7530 1727096062.71946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7530 1727096062.71953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7530 1727096062.71977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 7530 1727096062.71992: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 7530 1727096062.71995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7530 1727096062.72044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 7530 1727096062.72047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7530 1727096062.72050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7530 1727096062.72093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7530 1727096062.74029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7530 1727096062.74034: stdout chunk (state=3): >>><<< 7530 1727096062.74037: stderr chunk (state=3): >>><<< 7530 1727096062.74055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7530 1727096062.74061: handler run complete 7530 1727096062.74081: Evaluated conditional (False): False 7530 1727096062.74089: attempt loop complete, returning result 7530 1727096062.74091: _execute() done 7530 1727096062.74094: dumping result to json 7530 1727096062.74099: done dumping result, returning 7530 1727096062.74106: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0afff68d-5257-086b-f4f0-000000001d94] 7530 1727096062.74110: sending task result for task 0afff68d-5257-086b-f4f0-000000001d94 7530 1727096062.74220: done sending task result for task 0afff68d-5257-086b-f4f0-000000001d94 7530 1727096062.74224: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.328346", "end": "2024-09-23 08:54:22.694804", "rc": 0, "start": "2024-09-23 08:54:22.366458" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3475 0 --:--:-- --:--:-- --:--:-- 3505 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1380 0 --:--:-- --:--:-- --:--:-- 1385 7530 1727096062.74297: no more pending results, returning what we have 7530 1727096062.74300: results queue empty 7530 1727096062.74301: checking 
for any_errors_fatal 7530 1727096062.74312: done checking for any_errors_fatal 7530 1727096062.74313: checking for max_fail_percentage 7530 1727096062.74314: done checking for max_fail_percentage 7530 1727096062.74315: checking to see if all hosts have failed and the running result is not ok 7530 1727096062.74316: done checking to see if all hosts have failed 7530 1727096062.74317: getting the remaining hosts for this loop 7530 1727096062.74324: done getting the remaining hosts for this loop 7530 1727096062.74327: getting the next task for host managed_node3 7530 1727096062.74342: done getting next task for host managed_node3 7530 1727096062.74345: ^ task is: TASK: meta (flush_handlers) 7530 1727096062.74347: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7530 1727096062.74352: getting variables 7530 1727096062.74354: in VariableManager get_vars() 7530 1727096062.74405: Calling all_inventory to load vars for managed_node3 7530 1727096062.74408: Calling groups_inventory to load vars for managed_node3 7530 1727096062.74410: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096062.74421: Calling all_plugins_play to load vars for managed_node3 7530 1727096062.74424: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096062.74426: Calling groups_plugins_play to load vars for managed_node3 7530 1727096062.75249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096062.76120: done with get_vars() 7530 1727096062.76144: done getting variables 7530 1727096062.76202: in VariableManager get_vars() 7530 1727096062.76218: Calling all_inventory to load vars for managed_node3 7530 1727096062.76220: Calling groups_inventory to load vars for managed_node3 7530 1727096062.76221: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096062.76225: Calling all_plugins_play to load vars for managed_node3 7530 1727096062.76226: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096062.76228: Calling groups_plugins_play to load vars for managed_node3 7530 1727096062.76985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096062.77846: done with get_vars() 7530 1727096062.77875: done queuing things up, now waiting for results queue to drain 7530 1727096062.77876: results queue empty 7530 1727096062.77877: checking for any_errors_fatal 7530 1727096062.77880: done checking for any_errors_fatal 7530 1727096062.77880: checking for max_fail_percentage 7530 1727096062.77881: done checking for max_fail_percentage 7530 1727096062.77881: checking to see if all hosts have failed and the running result is not ok 7530 1727096062.77882: 
done checking to see if all hosts have failed 7530 1727096062.77882: getting the remaining hosts for this loop 7530 1727096062.77883: done getting the remaining hosts for this loop 7530 1727096062.77885: getting the next task for host managed_node3 7530 1727096062.77888: done getting next task for host managed_node3 7530 1727096062.77889: ^ task is: TASK: meta (flush_handlers) 7530 1727096062.77890: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7530 1727096062.77892: getting variables 7530 1727096062.77893: in VariableManager get_vars() 7530 1727096062.77906: Calling all_inventory to load vars for managed_node3 7530 1727096062.77908: Calling groups_inventory to load vars for managed_node3 7530 1727096062.77910: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096062.77916: Calling all_plugins_play to load vars for managed_node3 7530 1727096062.77917: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096062.77919: Calling groups_plugins_play to load vars for managed_node3 7530 1727096062.78572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7530 1727096062.79469: done with get_vars() 7530 1727096062.79485: done getting variables 7530 1727096062.79523: in VariableManager get_vars() 7530 1727096062.79539: Calling all_inventory to load vars for managed_node3 7530 1727096062.79541: Calling groups_inventory to load vars for managed_node3 7530 1727096062.79543: Calling all_plugins_inventory to load vars for managed_node3 7530 1727096062.79547: Calling all_plugins_play to load vars for managed_node3 7530 1727096062.79548: Calling groups_plugins_inventory to load vars for managed_node3 7530 1727096062.79550: Calling 
groups_plugins_play to load vars for managed_node3
7530 1727096062.80183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7530 1727096062.81043: done with get_vars()
7530 1727096062.81073: done queuing things up, now waiting for results queue to drain
7530 1727096062.81074: results queue empty
7530 1727096062.81075: checking for any_errors_fatal
7530 1727096062.81076: done checking for any_errors_fatal
7530 1727096062.81076: checking for max_fail_percentage
7530 1727096062.81077: done checking for max_fail_percentage
7530 1727096062.81077: checking to see if all hosts have failed and the running result is not ok
7530 1727096062.81078: done checking to see if all hosts have failed
7530 1727096062.81078: getting the remaining hosts for this loop
7530 1727096062.81079: done getting the remaining hosts for this loop
7530 1727096062.81081: getting the next task for host managed_node3
7530 1727096062.81083: done getting next task for host managed_node3
7530 1727096062.81084: ^ task is: None
7530 1727096062.81085: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7530 1727096062.81086: done queuing things up, now waiting for results queue to drain
7530 1727096062.81087: results queue empty
7530 1727096062.81087: checking for any_errors_fatal
7530 1727096062.81087: done checking for any_errors_fatal
7530 1727096062.81088: checking for max_fail_percentage
7530 1727096062.81089: done checking for max_fail_percentage
7530 1727096062.81089: checking to see if all hosts have failed and the running result is not ok
7530 1727096062.81090: done checking to see if all hosts have failed
7530 1727096062.81091: getting the next task for host managed_node3
7530 1727096062.81093: done getting next task for host managed_node3
7530 1727096062.81093: ^ task is: None
7530 1727096062.81094: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=128  changed=4    unreachable=0    failed=0    skipped=118  rescued=0    ignored=0

Monday 23 September 2024  08:54:22 -0400 (0:00:00.743)       0:00:53.599 ******
===============================================================================
Install iproute --------------------------------------------------------- 3.62s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 2.06s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.87s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.23s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.20s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Create veth interface veth0 --------------------------------------------- 1.17s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface veth0 --------------------------------------------- 1.12s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.02s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.92s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.83s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.81s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 0.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 0.79s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.77s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.76s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Install iproute --------------------------------------------------------- 0.75s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
7530 1727096062.81213: RUNNING CLEANUP