[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 8218 1726776613.97759: starting run ansible-playbook [core 2.16.11] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-uMf executable location = /usr/local/bin/ansible-playbook python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 8218 1726776613.98072: Added group all to inventory 8218 1726776613.98074: Added group ungrouped to inventory 8218 1726776613.98077: Group all now contains ungrouped 8218 1726776613.98079: Examining possible inventory source: /tmp/kernel_settings-iny/inventory.yml 8218 1726776614.06803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 8218 1726776614.06846: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 8218 1726776614.06865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 8218 1726776614.06906: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 8218 1726776614.06958: Loaded config def from plugin (inventory/script) 8218 1726776614.06959: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 8218 1726776614.06987: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 8218 1726776614.07045: Loaded config def from plugin 
(inventory/yaml) 8218 1726776614.07046: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 8218 1726776614.07108: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 8218 1726776614.07386: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 8218 1726776614.07389: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 8218 1726776614.07391: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 8218 1726776614.07395: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 8218 1726776614.07398: Loading data from /tmp/kernel_settings-iny/inventory.yml 8218 1726776614.07442: /tmp/kernel_settings-iny/inventory.yml was not parsable by auto 8218 1726776614.07485: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 8218 1726776614.07512: Loading data from /tmp/kernel_settings-iny/inventory.yml 8218 1726776614.07570: group all already in inventory 8218 1726776614.07575: set inventory_file for managed_node1 8218 1726776614.07578: set inventory_dir for managed_node1 8218 1726776614.07578: Added host managed_node1 to inventory 8218 1726776614.07580: Added host managed_node1 to group all 8218 1726776614.07580: set ansible_host for managed_node1 8218 1726776614.07581: set ansible_ssh_extra_args for managed_node1 8218 1726776614.07583: set inventory_file for managed_node2 8218 1726776614.07584: set inventory_dir for managed_node2 8218 1726776614.07584: Added host managed_node2 to inventory 8218 1726776614.07585: Added host managed_node2 to group all 8218 1726776614.07586: set ansible_host for managed_node2 8218 1726776614.07586: set ansible_ssh_extra_args for managed_node2 8218 
1726776614.07588: set inventory_file for managed_node3 8218 1726776614.07589: set inventory_dir for managed_node3 8218 1726776614.07589: Added host managed_node3 to inventory 8218 1726776614.07590: Added host managed_node3 to group all 8218 1726776614.07591: set ansible_host for managed_node3 8218 1726776614.07591: set ansible_ssh_extra_args for managed_node3 8218 1726776614.07593: Reconcile groups and hosts in inventory. 8218 1726776614.07595: Group ungrouped now contains managed_node1 8218 1726776614.07596: Group ungrouped now contains managed_node2 8218 1726776614.07597: Group ungrouped now contains managed_node3 8218 1726776614.07651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 8218 1726776614.07734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 8218 1726776614.07768: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 8218 1726776614.07785: Loaded config def from plugin (vars/host_group_vars) 8218 1726776614.07787: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 8218 1726776614.07791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 8218 1726776614.07796: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8218 1726776614.07822: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 8218 1726776614.08058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776614.08122: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 8218 1726776614.08147: Loaded config def from plugin (connection/local) 8218 1726776614.08149: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 8218 1726776614.08477: Loaded config def from plugin (connection/paramiko_ssh) 8218 1726776614.08479: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 8218 1726776614.09056: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8218 1726776614.09079: Loaded config def from plugin (connection/psrp) 8218 1726776614.09081: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 8218 1726776614.09503: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8218 1726776614.09525: Loaded config def from plugin (connection/ssh) 8218 1726776614.09527: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 8218 1726776614.10678: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8218 1726776614.10702: Loaded config def from plugin (connection/winrm) 8218 1726776614.10704: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8218 1726776614.10724: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 8218 1726776614.10773: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8218 1726776614.10811: Loaded config def from plugin (shell/cmd) 8218 1726776614.10813: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8218 1726776614.10831: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8218 1726776614.10869: Loaded config def from plugin (shell/powershell) 8218 1726776614.10870: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8218 1726776614.10907: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 8218 1726776614.11009: Loaded config def from plugin (shell/sh) 8218 1726776614.11011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8218 1726776614.11037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 8218 1726776614.11108: Loaded config def from plugin (become/runas) 8218 1726776614.11109: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8218 1726776614.11218: Loaded config def from plugin (become/su) 8218 1726776614.11219: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8218 1726776614.11316: Loaded config def from plugin (become/sudo) 8218 1726776614.11317: 
Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 8218 1726776614.11341: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml 8218 1726776614.11970: trying /usr/local/lib/python3.12/site-packages/ansible/modules 8218 1726776614.13931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 8218 1726776614.13964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 8218 1726776614.15552: in VariableManager get_vars() 8218 1726776614.15568: done with get_vars() 8218 1726776614.15600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 8218 1726776614.15608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 8218 1726776614.15767: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 8218 1726776614.15862: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 8218 1726776614.15864: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-uMf/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 8218 1726776614.15885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 8218 1726776614.15902: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8218 1726776614.15998: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 8218 1726776614.16037: Loaded config def from plugin (callback/default) 8218 1726776614.16038: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8218 1726776614.16774: Loaded config def from plugin (callback/junit) 8218 1726776614.16776: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8218 1726776614.16805: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 8218 1726776614.16845: Loaded config def from plugin (callback/minimal) 8218 1726776614.16846: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8218 1726776614.16875: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8218 1726776614.16913: Loaded config def from plugin (callback/tree) 8218 1726776614.16915: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 8218 1726776614.16988: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 8218 1726776614.16990: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-uMf/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
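The run reports "No config file found; using defaults", yet the ansible.posix.debug stdout callback and the profile_tasks and junit callbacks are loaded, so they were almost certainly enabled through environment variables rather than ansible.cfg. A hedged sketch of how such a run could be launched; the values are assumptions inferred from the log, not taken from the actual test harness:

```shell
# Sketch: enable the callbacks seen above via environment variables.
# The exact values used by the real test run are not shown in the log.
export ANSIBLE_STDOUT_CALLBACK=ansible.posix.debug       # replaces 'default' as the stdout callback
export ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks,junit
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-uMf     # singular form, per the deprecation warning at the top
echo "$ANSIBLE_CALLBACKS_ENABLED"
```

With a stdout callback already set, the later "Skipping callback 'default' … 'minimal' … 'oneline'" lines follow: only one stdout callback is active per run.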
PLAYBOOK: tests_change_settings.yml ********************************************
1 plays in /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml
8218 1726776614.17007: in VariableManager get_vars() 8218 1726776614.17017: done with get_vars() 8218 1726776614.17021: in VariableManager get_vars() 8218 1726776614.17027: done with get_vars() 8218 1726776614.17031: variable 'omit' from source: magic vars 8218 1726776614.17057: in VariableManager get_vars() 8218 1726776614.17066: done with get_vars() 8218 1726776614.17079: variable 'omit' from source: magic vars
PLAY [Test changing settings] **************************************************
8218 1726776614.18714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 8218 1726776614.18768: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 8218 1726776614.18789: getting the remaining hosts for this loop 8218 1726776614.18790: done getting the remaining hosts for this loop 8218 1726776614.18792: getting the next task for host managed_node2 8218 1726776614.18794: done getting next task for host managed_node2 8218 1726776614.18798: ^ task is: TASK: Gathering Facts 8218 1726776614.18799: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776614.18800: getting variables 8218 1726776614.18801: in VariableManager get_vars() 8218 1726776614.18807: Calling all_inventory to load vars for managed_node2 8218 1726776614.18809: Calling groups_inventory to load vars for managed_node2 8218 1726776614.18810: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776614.18818: Calling all_plugins_play to load vars for managed_node2 8218 1726776614.18825: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776614.18829: Calling groups_plugins_play to load vars for managed_node2 8218 1726776614.18851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776614.18882: done with get_vars() 8218 1726776614.18886: done getting variables 8218 1726776614.18934: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:2
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.020) 0:00:00.020 ****
8218 1726776614.18947: entering _queue_task() for managed_node2/gather_facts 8218 1726776614.18948: Creating lock for gather_facts 8218 1726776614.19130: worker is 1 (out of 1 available) 8218 1726776614.19140: exiting _queue_task() for managed_node2/gather_facts 8218 1726776614.19151: done queuing things up, now waiting for results queue to drain 8218 1726776614.19153: waiting for pending results...
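The Gathering Facts task queued above proceeds through a series of raw `/bin/sh` commands visible in the stdout/stderr chunks that follow: first a home-directory probe, then creation of a mode-0700 remote temp directory. A minimal sketch reproducing them locally, against $HOME rather than the managed node; the directory name is an illustrative stand-in for Ansible's `ansible-tmp-<timestamp>-<pid>-<random>` pattern:

```shell
# Probe the login user's home directory, as the first
# _low_level_execute_command() below does:
home=$(/bin/sh -c 'echo ~ && sleep 0')

# Create a private temp dir the way the second command does; umask 77
# yields mode 0700, so other users cannot read staged module payloads.
# The backtick-echo wrapping mirrors Ansible's own tilde-expansion trick.
tmpdir=$(/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp `" && mkdir "` echo ~/.ansible/tmp/ansible-tmp-demo.$$ `" && echo "` echo ~/.ansible/tmp/ansible-tmp-demo.$$ `" ) && sleep 0')
echo "$tmpdir"
```

The `&& sleep 0` suffix is Ansible's way of forcing the shell to flush and exit cleanly before the connection plugin reads the result.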
8221 1726776614.19226: running TaskExecutor() for managed_node2/TASK: Gathering Facts 8221 1726776614.19308: in run() - task 120fa90a-8a95-cec2-986e-00000000002f 8221 1726776614.19324: variable 'ansible_search_path' from source: unknown 8221 1726776614.19351: calling self._execute() 8221 1726776614.19437: variable 'ansible_host' from source: host vars for 'managed_node2' 8221 1726776614.19447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8221 1726776614.19456: variable 'omit' from source: magic vars 8221 1726776614.19518: variable 'omit' from source: magic vars 8221 1726776614.19541: variable 'omit' from source: magic vars 8221 1726776614.19565: variable 'omit' from source: magic vars 8221 1726776614.19599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8221 1726776614.19624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8221 1726776614.19641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8221 1726776614.19655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8221 1726776614.19666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8221 1726776614.19688: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8221 1726776614.19693: variable 'ansible_host' from source: host vars for 'managed_node2' 8221 1726776614.19698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8221 1726776614.19764: Set connection var ansible_connection to ssh 8221 1726776614.19772: Set connection var ansible_pipelining to False 8221 1726776614.19779: Set connection var ansible_timeout to 10 8221 1726776614.19786: Set connection var ansible_module_compression to ZIP_DEFLATED 8221 1726776614.19791: 
Set connection var ansible_shell_type to sh 8221 1726776614.19796: Set connection var ansible_shell_executable to /bin/sh 8221 1726776614.19809: variable 'ansible_shell_executable' from source: unknown 8221 1726776614.19812: variable 'ansible_connection' from source: unknown 8221 1726776614.19815: variable 'ansible_module_compression' from source: unknown 8221 1726776614.19817: variable 'ansible_shell_type' from source: unknown 8221 1726776614.19819: variable 'ansible_shell_executable' from source: unknown 8221 1726776614.19820: variable 'ansible_host' from source: host vars for 'managed_node2' 8221 1726776614.19822: variable 'ansible_pipelining' from source: unknown 8221 1726776614.19824: variable 'ansible_timeout' from source: unknown 8221 1726776614.19826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8221 1726776614.19938: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8221 1726776614.19946: variable 'omit' from source: magic vars 8221 1726776614.19950: starting attempt loop 8221 1726776614.19952: running the handler 8221 1726776614.19962: variable 'ansible_facts' from source: unknown 8221 1726776614.19974: _low_level_execute_command(): starting 8221 1726776614.19979: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8221 1726776614.22084: stderr chunk (state=2): >>>Warning: Permanently added '10.31.12.75' (ECDSA) to the list of known hosts. 
<<< 8221 1726776614.34608: stdout chunk (state=3): >>>/root <<< 8221 1726776614.34911: stderr chunk (state=3): >>><<< 8221 1726776614.34922: stdout chunk (state=3): >>><<< 8221 1726776614.34947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=Warning: Permanently added '10.31.12.75' (ECDSA) to the list of known hosts. 8221 1726776614.34965: _low_level_execute_command(): starting 8221 1726776614.34974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246 `" && echo ansible-tmp-1726776614.3495598-8221-182168403501246="` echo /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246 `" ) && sleep 0' 8221 1726776614.37687: stdout chunk (state=2): >>>ansible-tmp-1726776614.3495598-8221-182168403501246=/root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246 <<< 8221 1726776614.37831: stderr chunk (state=3): >>><<< 8221 1726776614.37838: stdout chunk (state=3): >>><<< 8221 1726776614.37854: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776614.3495598-8221-182168403501246=/root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246 , stderr= 8221 1726776614.37882: variable 'ansible_module_compression' from source: unknown 8221 1726776614.37938: ANSIBALLZ: Using generic lock for ansible.legacy.setup 8221 1726776614.37944: ANSIBALLZ: Acquiring lock 8221 1726776614.37947: ANSIBALLZ: Lock acquired: 140571206407024 8221 1726776614.37951: ANSIBALLZ: Creating module 8221 1726776614.69900: ANSIBALLZ: Writing module into payload 8221 1726776614.70071: ANSIBALLZ: Writing module 8221 1726776614.70096: ANSIBALLZ: Renaming module 8221 1726776614.70104: ANSIBALLZ: Done creating module 8221 1726776614.70139: variable 'ansible_facts' from source: unknown 8221 1726776614.70148: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8221 1726776614.70157: 
_low_level_execute_command(): starting 8221 1726776614.70163: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0' 8221 1726776614.72899: stdout chunk (state=2): >>>PLATFORM <<< 8221 1726776614.72961: stdout chunk (state=3): >>>Linux <<< 8221 1726776614.72992: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3.6 <<< 8221 1726776614.73020: stdout chunk (state=3): >>>/usr/bin/python3 /usr/libexec/platform-python ENDFOUND <<< 8221 1726776614.73181: stderr chunk (state=3): >>><<< 8221 1726776614.73187: stdout chunk (state=3): >>><<< 8221 1726776614.73202: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND , stderr= 8221 1726776614.73208 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python'] 8221 1726776614.73247: _low_level_execute_command(): starting 8221 1726776614.73253: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8221 1726776614.74939: Sending initial data 8221 1726776614.74948: Sent initial data (1234 bytes) 8221 1726776614.80214: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 
8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 8221 1726776614.80786: stderr chunk (state=3): >>><<< 8221 1726776614.80794: stdout chunk (state=3): >>><<< 8221 1726776614.80808: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr= 8221 1726776614.80860: variable 'ansible_facts' from source: unknown 8221 1726776614.80866: variable 'ansible_facts' from source: unknown 8221 1726776614.80876: variable 'ansible_module_compression' from source: unknown 8221 1726776614.80907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8221 1726776614.80936: variable 'ansible_facts' from source: unknown 8221 1726776614.81085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/AnsiballZ_setup.py 8221 1726776614.81193: Sending initial data 8221 1726776614.81200: Sent initial data (152 bytes) 8221 1726776614.84036: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpw2_48yq2 /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/AnsiballZ_setup.py <<< 8221 1726776614.86868: stderr chunk (state=3): >>><<< 8221 1726776614.86881: stdout chunk (state=3): >>><<< 8221 1726776614.86908: done transferring module to remote 8221 1726776614.86923: 
_low_level_execute_command(): starting 8221 1726776614.86931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/ /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/AnsiballZ_setup.py && sleep 0' 8221 1726776614.89565: stderr chunk (state=2): >>><<< 8221 1726776614.89577: stdout chunk (state=2): >>><<< 8221 1726776614.89594: _low_level_execute_command() done: rc=0, stdout=, stderr= 8221 1726776614.89601: _low_level_execute_command(): starting 8221 1726776614.89607: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/AnsiballZ_setup.py && sleep 0' 8221 1726776615.90438: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANE82uVxtVfIhrJuj+Z56Gg9bJlDR+yYhhx40nddz9Kp8spHmHNvFWFn+e7QAGUfCb+6Bn9hYTJ01jfHaC5ohG1MVnAda9CG+H9c6PQ+gHSYPJWdLw/SGnwyt5N6bir8gJvf5eqrXg0FFbm02VDZJAH4ww7gBr9WPGM4PY1Xusd3AAAAFQDMFufFoivmmHcgCnY/kt+ytzmOaQAAAIBcsxi9kZhiwOrK7psJGYdQI1cVUnaqKMfAmHz2cWmhI4jYrMQOwdMF3XQMyXgmrePEWFnuov5VbepbLu43oTrQo18/5uhe6kek0DuOeKivfAx8E4a6lh3OiSNw8mu5dYVcLv+bd4Kj97aZb9Gc715QJAj3ImLk7gMK0nFbaUkdZAAAAIA7wufmEs3LK2y8ttz87wJ4frWgcvNvSRJjeZACpPTicryWGrcOtjvdBeYguJ9vlncJisC4nPK3GYKg7yxbmiWL5TPmvQTT6fsy7cLlKkmIbtui/icHcNPTfBXqvJa3ynXTEfNrbid/WOzdTSO0utdrr4LeOgfnqsuif0W/n1CZ7A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCnzq1r0NFAn93EqXQx38G6hyHP8YT9vzeAiqkNxeexjxvgmckt0NM6KskMEaUcog0Gj/4V2FN7mOfj0Gbefb5fespgQ1eR63jDKf/JPDIQ3nN0RnnM77fe7kBba4QXYVDpY7zYR2LorYUWCCWu9hRdsPJzsf22DmXyyZ96gipq1Rg0VnVbyg7oorYufErgnBCXQtgE9Ffr9z2J40csh9GmrmoV5X8JZoSUWePkZXOackxaNHMOBPI5rzUnEjaTP5xZpIc6YAHBQKY0YRocgPk4Bdyku9zRtEeumHFPIvJCfr2mPjdStySjjtKdbJqbKfma3xtz0jVbVoTfltqc6+h5y99ObN99DAe1mmhzp6DrddXYqxdnhd9DaKzrbLE3uRGmSBa2rjmreEjaI+RKLtUwgeWHjgfJQG2OPki9a2VcHfLhBaH0TZhuW7CTEycsVb5YIgqDNdlnV3KDx/cWFHF9q9elURUwgyrD28mszD7cnZEMBmsfdPa3heOOGmZaqkM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNjw/e9P8QOb520cgNDhZpEn2NWDALjpSK9lpaLEOJsey1fh+RV1Bkt5jHN+4WHIvyqRDGP8roN+DvPf+K8h9oI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMHfdyzKk5Ao/6nDckScEFXouLZL8ZutL+VmnNfL8fQj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-75", "ansible_nodename": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "f9cbf545a7bd4357ac99f30c9cf5a21a", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "16", "minute": "10", "second": "15", "epoch": "1726776615", "epoch_int": "1726776615", "date": "2024-09-19", "time": "16:10:15", "iso8601_micro": "2024-09-19T20:10:15.297226Z", "iso8601": "2024-09-19T20:10:15Z", "iso8601_basic": "20240919T161015297226", "iso8601_basic_short": "20240919T161015", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.41, "5m": 0.36, "15m": 0.16}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.10.210 50220 10.31.12.75 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.10.210 50220 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which 
--tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2714, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 825, "free": 2714}, "nocache": {"free": 3301, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec276857-e760-f0b2-0e35-2b06796fb60c", "ansible_product_uuid": "ec276857-e760-f0b2-0e35-2b06796fb60c", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 226, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263481683968, "block_size": 4096, "block_total": 65533179, "block_available": 64326583, "block_used": 1206596, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:c8:82:e3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.75", "broadcast": 
"10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fec8:82e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.75", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:c8:82:e3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.75"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fec8:82e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.75", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fec8:82e3"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8221 1726776615.92119: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8221 1726776615.92132: stdout chunk (state=3): >>><<< 8221 1726776615.92142: stderr chunk (state=3): >>><<< 8221 1726776615.92170: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANE82uVxtVfIhrJuj+Z56Gg9bJlDR+yYhhx40nddz9Kp8spHmHNvFWFn+e7QAGUfCb+6Bn9hYTJ01jfHaC5ohG1MVnAda9CG+H9c6PQ+gHSYPJWdLw/SGnwyt5N6bir8gJvf5eqrXg0FFbm02VDZJAH4ww7gBr9WPGM4PY1Xusd3AAAAFQDMFufFoivmmHcgCnY/kt+ytzmOaQAAAIBcsxi9kZhiwOrK7psJGYdQI1cVUnaqKMfAmHz2cWmhI4jYrMQOwdMF3XQMyXgmrePEWFnuov5VbepbLu43oTrQo18/5uhe6kek0DuOeKivfAx8E4a6lh3OiSNw8mu5dYVcLv+bd4Kj97aZb9Gc715QJAj3ImLk7gMK0nFbaUkdZAAAAIA7wufmEs3LK2y8ttz87wJ4frWgcvNvSRJjeZACpPTicryWGrcOtjvdBeYguJ9vlncJisC4nPK3GYKg7yxbmiWL5TPmvQTT6fsy7cLlKkmIbtui/icHcNPTfBXqvJa3ynXTEfNrbid/WOzdTSO0utdrr4LeOgfnqsuif0W/n1CZ7A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCnzq1r0NFAn93EqXQx38G6hyHP8YT9vzeAiqkNxeexjxvgmckt0NM6KskMEaUcog0Gj/4V2FN7mOfj0Gbefb5fespgQ1eR63jDKf/JPDIQ3nN0RnnM77fe7kBba4QXYVDpY7zYR2LorYUWCCWu9hRdsPJzsf22DmXyyZ96gipq1Rg0VnVbyg7oorYufErgnBCXQtgE9Ffr9z2J40csh9GmrmoV5X8JZoSUWePkZXOackxaNHMOBPI5rzUnEjaTP5xZpIc6YAHBQKY0YRocgPk4Bdyku9zRtEeumHFPIvJCfr2mPjdStySjjtKdbJqbKfma3xtz0jVbVoTfltqc6+h5y99ObN99DAe1mmhzp6DrddXYqxdnhd9DaKzrbLE3uRGmSBa2rjmreEjaI+RKLtUwgeWHjgfJQG2OPki9a2VcHfLhBaH0TZhuW7CTEycsVb5YIgqDNdlnV3KDx/cWFHF9q9elURUwgyrD28mszD7cnZEMBmsfdPa3heOOGmZaqkM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNjw/e9P8QOb520cgNDhZpEn2NWDALjpSK9lpaLEOJsey1fh+RV1Bkt5jHN+4WHIvyqRDGP8roN+DvPf+K8h9oI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMHfdyzKk5Ao/6nDckScEFXouLZL8ZutL+VmnNfL8fQj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-75", "ansible_nodename": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "f9cbf545a7bd4357ac99f30c9cf5a21a", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", 
"weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "16", "minute": "10", "second": "15", "epoch": "1726776615", "epoch_int": "1726776615", "date": "2024-09-19", "time": "16:10:15", "iso8601_micro": "2024-09-19T20:10:15.297226Z", "iso8601": "2024-09-19T20:10:15Z", "iso8601_basic": "20240919T161015297226", "iso8601_basic_short": "20240919T161015", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.41, "5m": 0.36, "15m": 0.16}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.10.210 50220 10.31.12.75 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.10.210 50220 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2714, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 825, "free": 2714}, "nocache": {"free": 3301, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, 
"cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec276857-e760-f0b2-0e35-2b06796fb60c", "ansible_product_uuid": "ec276857-e760-f0b2-0e35-2b06796fb60c", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 226, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263481683968, "block_size": 4096, "block_total": 65533179, "block_available": 64326583, "block_used": 1206596, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:c8:82:e3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.75", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fec8:82e3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.75", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:c8:82:e3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.75"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fec8:82e3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.75", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fec8:82e3"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.12.75 closed. 8221 1726776615.92511: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8221 1726776615.92538: _low_level_execute_command(): starting 8221 1726776615.92545: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776614.3495598-8221-182168403501246/ > /dev/null 2>&1 && sleep 0' 8221 1726776615.95185: stderr chunk (state=2): >>><<< 8221 1726776615.95194: stdout chunk (state=2): >>><<< 8221 1726776615.95211: _low_level_execute_command() done: rc=0, stdout=, stderr= 8221 1726776615.95219: handler run complete 8221 1726776615.95336: variable 'ansible_facts' from source: unknown 8221 1726776615.95433: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8221 1726776615.95742: variable 'ansible_facts' from source: unknown 8221 1726776615.95819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8221 1726776615.95941: attempt loop complete, returning result 8221 1726776615.95948: _execute() done 8221 1726776615.95951: dumping result to json 8221 1726776615.95980: done dumping result, returning 8221 1726776615.95989: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [120fa90a-8a95-cec2-986e-00000000002f] 8221 1726776615.95996: sending task result for task 120fa90a-8a95-cec2-986e-00000000002f 8221 1726776615.96185: done sending task result for task 120fa90a-8a95-cec2-986e-00000000002f 8221 1726776615.96190: WORKER PROCESS EXITING ok: [managed_node2] 8218 1726776615.96973: no more pending results, returning what we have 8218 1726776615.96977: results queue empty 8218 1726776615.96977: checking for any_errors_fatal 8218 1726776615.96979: done checking for any_errors_fatal 8218 1726776615.96980: checking for max_fail_percentage 8218 1726776615.96981: done checking for max_fail_percentage 8218 1726776615.96982: checking to see if all hosts have failed and the running result is not ok 8218 1726776615.96982: done checking to see if all hosts have failed 8218 1726776615.96983: getting the remaining hosts for this loop 8218 1726776615.96984: done getting the remaining hosts for this loop 8218 1726776615.96988: getting the next task for host managed_node2 8218 1726776615.96993: done getting next task for host managed_node2 8218 1726776615.96995: ^ task is: TASK: meta (flush_handlers) 8218 1726776615.96996: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8218 1726776615.97000: getting variables 8218 1726776615.97001: in VariableManager get_vars() 8218 1726776615.97025: Calling all_inventory to load vars for managed_node2 8218 1726776615.97027: Calling groups_inventory to load vars for managed_node2 8218 1726776615.97032: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776615.97041: Calling all_plugins_play to load vars for managed_node2 8218 1726776615.97044: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776615.97047: Calling groups_plugins_play to load vars for managed_node2 8218 1726776615.97216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776615.97408: done with get_vars() 8218 1726776615.97419: done getting variables 8218 1726776615.97487: in VariableManager get_vars() 8218 1726776615.97497: Calling all_inventory to load vars for managed_node2 8218 1726776615.97499: Calling groups_inventory to load vars for managed_node2 8218 1726776615.97501: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776615.97506: Calling all_plugins_play to load vars for managed_node2 8218 1726776615.97508: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776615.97511: Calling groups_plugins_play to load vars for managed_node2 8218 1726776615.97642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776615.97814: done with get_vars() 8218 1726776615.97827: done queuing things up, now waiting for results queue to drain 8218 1726776615.97836: results queue empty 8218 1726776615.97837: checking for any_errors_fatal 8218 1726776615.97840: done checking for any_errors_fatal 8218 1726776615.97841: checking for max_fail_percentage 8218 1726776615.97842: done checking for max_fail_percentage 8218 1726776615.97842: checking to see if all hosts have failed and the 
running result is not ok 8218 1726776615.97843: done checking to see if all hosts have failed 8218 1726776615.97843: getting the remaining hosts for this loop 8218 1726776615.97844: done getting the remaining hosts for this loop 8218 1726776615.97847: getting the next task for host managed_node2 8218 1726776615.97852: done getting next task for host managed_node2 8218 1726776615.97854: ^ task is: TASK: Check if system is ostree 8218 1726776615.97856: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776615.97858: getting variables 8218 1726776615.97859: in VariableManager get_vars() 8218 1726776615.97867: Calling all_inventory to load vars for managed_node2 8218 1726776615.97869: Calling groups_inventory to load vars for managed_node2 8218 1726776615.97871: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776615.97875: Calling all_plugins_play to load vars for managed_node2 8218 1726776615.97876: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776615.97879: Calling groups_plugins_play to load vars for managed_node2 8218 1726776615.98001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776615.98167: done with get_vars() 8218 1726776615.98175: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:12 Thursday 19 September 2024 16:10:15 -0400 (0:00:01.792) 0:00:01.813 **** 8218 1726776615.98248: entering _queue_task() for managed_node2/stat 8218 1726776615.98478: worker is 1 (out of 1 available) 8218 1726776615.98490: exiting _queue_task() for managed_node2/stat 8218 1726776615.98500: done queuing things up, now waiting for results queue to drain 8218 1726776615.98502: waiting for pending results... 
8287 1726776615.98681: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 8287 1726776615.98800: in run() - task 120fa90a-8a95-cec2-986e-000000000007 8287 1726776615.98818: variable 'ansible_search_path' from source: unknown 8287 1726776615.98852: calling self._execute() 8287 1726776615.98916: variable 'ansible_host' from source: host vars for 'managed_node2' 8287 1726776615.98926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8287 1726776615.98938: variable 'omit' from source: magic vars 8287 1726776615.99356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8287 1726776615.99616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8287 1726776615.99660: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8287 1726776615.99695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8287 1726776615.99730: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8287 1726776615.99800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8287 1726776615.99824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8287 1726776615.99866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8287 1726776615.99892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 8287 1726776616.00004: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8287 1726776616.00012: variable 'omit' from source: magic vars 8287 1726776616.00047: variable 'omit' from source: magic vars 8287 1726776616.00080: variable 'omit' from source: magic vars 8287 1726776616.00104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8287 1726776616.00131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8287 1726776616.00149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8287 1726776616.00165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8287 1726776616.00175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8287 1726776616.00201: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8287 1726776616.00207: variable 'ansible_host' from source: host vars for 'managed_node2' 8287 1726776616.00211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8287 1726776616.00300: Set connection var ansible_connection to ssh 8287 1726776616.00308: Set connection var ansible_pipelining to False 8287 1726776616.00314: Set connection var ansible_timeout to 10 8287 1726776616.00321: Set connection var ansible_module_compression to ZIP_DEFLATED 8287 1726776616.00327: Set connection var ansible_shell_type to sh 8287 1726776616.00334: Set connection var ansible_shell_executable to /bin/sh 8287 1726776616.00353: variable 'ansible_shell_executable' from source: unknown 8287 1726776616.00359: variable 'ansible_connection' from source: unknown 8287 1726776616.00362: variable 'ansible_module_compression' from source: unknown 8287 1726776616.00365: variable 
'ansible_shell_type' from source: unknown 8287 1726776616.00368: variable 'ansible_shell_executable' from source: unknown 8287 1726776616.00371: variable 'ansible_host' from source: host vars for 'managed_node2' 8287 1726776616.00374: variable 'ansible_pipelining' from source: unknown 8287 1726776616.00377: variable 'ansible_timeout' from source: unknown 8287 1726776616.00380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8287 1726776616.00501: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8287 1726776616.00513: variable 'omit' from source: magic vars 8287 1726776616.00519: starting attempt loop 8287 1726776616.00522: running the handler 8287 1726776616.00535: _low_level_execute_command(): starting 8287 1726776616.00543: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8287 1726776616.03140: stdout chunk (state=2): >>>/root <<< 8287 1726776616.03413: stderr chunk (state=3): >>><<< 8287 1726776616.03420: stdout chunk (state=3): >>><<< 8287 1726776616.03443: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8287 1726776616.03460: _low_level_execute_command(): starting 8287 1726776616.03468: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455 `" && echo ansible-tmp-1726776616.0345228-8287-157071975706455="` echo /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455 `" ) && sleep 0' 8287 1726776616.06195: stdout chunk (state=2): >>>ansible-tmp-1726776616.0345228-8287-157071975706455=/root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455 <<< 8287 1726776616.06340: stderr chunk (state=3): >>><<< 8287 1726776616.06347: 
stdout chunk (state=3): >>><<< 8287 1726776616.06366: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776616.0345228-8287-157071975706455=/root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455 , stderr= 8287 1726776616.06408: variable 'ansible_module_compression' from source: unknown 8287 1726776616.06471: ANSIBALLZ: Using lock for stat 8287 1726776616.06477: ANSIBALLZ: Acquiring lock 8287 1726776616.06481: ANSIBALLZ: Lock acquired: 140571206407264 8287 1726776616.06485: ANSIBALLZ: Creating module 8287 1726776616.19946: ANSIBALLZ: Writing module into payload 8287 1726776616.20068: ANSIBALLZ: Writing module 8287 1726776616.20090: ANSIBALLZ: Renaming module 8287 1726776616.20097: ANSIBALLZ: Done creating module 8287 1726776616.20114: variable 'ansible_facts' from source: unknown 8287 1726776616.20207: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/AnsiballZ_stat.py 8287 1726776616.20698: Sending initial data 8287 1726776616.20706: Sent initial data (151 bytes) 8287 1726776616.23494: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpswju3pog /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/AnsiballZ_stat.py <<< 8287 1726776616.24931: stderr chunk (state=3): >>><<< 8287 1726776616.24940: stdout chunk (state=3): >>><<< 8287 1726776616.24964: done transferring module to remote 8287 1726776616.24976: _low_level_execute_command(): starting 8287 1726776616.24983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/ /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/AnsiballZ_stat.py && sleep 0' 8287 1726776616.27616: stderr chunk (state=2): >>><<< 8287 1726776616.27625: stdout chunk (state=2): >>><<< 8287 1726776616.27643: _low_level_execute_command() done: rc=0, stdout=, stderr= 8287 1726776616.27648: _low_level_execute_command(): 
starting 8287 1726776616.27653: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/AnsiballZ_stat.py && sleep 0' 8287 1726776616.43019: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8287 1726776616.44184: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8287 1726776616.44195: stdout chunk (state=3): >>><<< 8287 1726776616.44206: stderr chunk (state=3): >>><<< 8287 1726776616.44219: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
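
The stat result above hinges on a single marker file: rpm-ostree based systems (e.g. Fedora IoT/CoreOS) create `/run/ostree-booted` at boot, so `"exists": false` means a conventional package-managed host. Stripped of the module machinery, the check reduces to (hedged sketch; the path is taken from the logged module args):

```python
# Sketch of the check the logged stat task performs: an ostree-booted
# system creates this marker file at boot; its absence implies a
# traditional package-based system.
import os

def is_ostree_booted(marker="/run/ostree-booted"):
    return os.path.exists(marker)
```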
8287 1726776616.44267: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8287 1726776616.44281: _low_level_execute_command(): starting 8287 1726776616.44287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776616.0345228-8287-157071975706455/ > /dev/null 2>&1 && sleep 0' 8287 1726776616.46986: stderr chunk (state=2): >>><<< 8287 1726776616.46995: stdout chunk (state=2): >>><<< 8287 1726776616.47018: _low_level_execute_command() done: rc=0, stdout=, stderr= 8287 1726776616.47027: handler run complete 8287 1726776616.47045: attempt loop complete, returning result 8287 1726776616.47049: _execute() done 8287 1726776616.47052: dumping result to json 8287 1726776616.47056: done dumping result, returning 8287 1726776616.47063: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [120fa90a-8a95-cec2-986e-000000000007] 8287 1726776616.47076: sending task result for task 120fa90a-8a95-cec2-986e-000000000007 8287 1726776616.47107: done sending task result for task 120fa90a-8a95-cec2-986e-000000000007 8287 1726776616.47110: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8218 1726776616.47258: no more pending results, returning what we have 8218 1726776616.47262: results queue empty 8218 1726776616.47262: checking for any_errors_fatal 8218 1726776616.47264: 
done checking for any_errors_fatal 8218 1726776616.47265: checking for max_fail_percentage 8218 1726776616.47266: done checking for max_fail_percentage 8218 1726776616.47267: checking to see if all hosts have failed and the running result is not ok 8218 1726776616.47268: done checking to see if all hosts have failed 8218 1726776616.47268: getting the remaining hosts for this loop 8218 1726776616.47269: done getting the remaining hosts for this loop 8218 1726776616.47273: getting the next task for host managed_node2 8218 1726776616.47278: done getting next task for host managed_node2 8218 1726776616.47280: ^ task is: TASK: Set flag to indicate system is ostree 8218 1726776616.47282: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776616.47285: getting variables 8218 1726776616.47286: in VariableManager get_vars() 8218 1726776616.47311: Calling all_inventory to load vars for managed_node2 8218 1726776616.47314: Calling groups_inventory to load vars for managed_node2 8218 1726776616.47317: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776616.47325: Calling all_plugins_play to load vars for managed_node2 8218 1726776616.47327: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776616.47332: Calling groups_plugins_play to load vars for managed_node2 8218 1726776616.47443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776616.47557: done with get_vars() 8218 1726776616.47565: done getting variables 8218 1726776616.47631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:17 Thursday 19 September 2024 16:10:16 -0400 (0:00:00.494) 0:00:02.307 **** 8218 1726776616.47653: entering _queue_task() for managed_node2/set_fact 8218 1726776616.47654: Creating lock for set_fact 8218 1726776616.47815: worker is 1 (out of 1 available) 8218 1726776616.47830: exiting _queue_task() for managed_node2/set_fact 8218 1726776616.47842: done queuing things up, now waiting for results queue to drain 8218 1726776616.47844: waiting for pending results... 
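
Taken together, the two traced tasks (tests_change_settings.yml:12 and :17) plausibly correspond to something like the following. This is a hedged reconstruction from the logged module arguments, registered variable name, and evaluated conditional — not the verified playbook source:

```yaml
# Hypothetical reconstruction; all names are taken from the trace.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined
```

This explains why the set_fact task below needs no remote connection: it runs entirely on the controller and completes in milliseconds (0:00:00.025 in the timing line).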
8302 1726776616.47953: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 8302 1726776616.48048: in run() - task 120fa90a-8a95-cec2-986e-000000000008 8302 1726776616.48069: variable 'ansible_search_path' from source: unknown 8302 1726776616.48100: calling self._execute() 8302 1726776616.48159: variable 'ansible_host' from source: host vars for 'managed_node2' 8302 1726776616.48168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8302 1726776616.48175: variable 'omit' from source: magic vars 8302 1726776616.48514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8302 1726776616.48725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8302 1726776616.48764: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8302 1726776616.48793: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8302 1726776616.48820: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8302 1726776616.48907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8302 1726776616.48932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8302 1726776616.48954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8302 1726776616.48976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8302 1726776616.49080: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8302 1726776616.49088: variable 'omit' from source: magic vars 8302 1726776616.49113: variable 'omit' from source: magic vars 8302 1726776616.49190: variable '__ostree_booted_stat' from source: set_fact 8302 1726776616.49227: variable 'omit' from source: magic vars 8302 1726776616.49249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8302 1726776616.49269: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8302 1726776616.49285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8302 1726776616.49298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8302 1726776616.49307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8302 1726776616.49332: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8302 1726776616.49338: variable 'ansible_host' from source: host vars for 'managed_node2' 8302 1726776616.49343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8302 1726776616.49402: Set connection var ansible_connection to ssh 8302 1726776616.49409: Set connection var ansible_pipelining to False 8302 1726776616.49416: Set connection var ansible_timeout to 10 8302 1726776616.49423: Set connection var ansible_module_compression to ZIP_DEFLATED 8302 1726776616.49428: Set connection var ansible_shell_type to sh 8302 1726776616.49435: Set connection var ansible_shell_executable to /bin/sh 8302 1726776616.49449: variable 'ansible_shell_executable' from source: unknown 8302 1726776616.49451: variable 'ansible_connection' 
from source: unknown 8302 1726776616.49453: variable 'ansible_module_compression' from source: unknown 8302 1726776616.49455: variable 'ansible_shell_type' from source: unknown 8302 1726776616.49457: variable 'ansible_shell_executable' from source: unknown 8302 1726776616.49460: variable 'ansible_host' from source: host vars for 'managed_node2' 8302 1726776616.49463: variable 'ansible_pipelining' from source: unknown 8302 1726776616.49464: variable 'ansible_timeout' from source: unknown 8302 1726776616.49466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8302 1726776616.49520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8302 1726776616.49528: variable 'omit' from source: magic vars 8302 1726776616.49534: starting attempt loop 8302 1726776616.49536: running the handler 8302 1726776616.49544: handler run complete 8302 1726776616.49551: attempt loop complete, returning result 8302 1726776616.49553: _execute() done 8302 1726776616.49555: dumping result to json 8302 1726776616.49557: done dumping result, returning 8302 1726776616.49562: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [120fa90a-8a95-cec2-986e-000000000008] 8302 1726776616.49565: sending task result for task 120fa90a-8a95-cec2-986e-000000000008 8302 1726776616.49582: done sending task result for task 120fa90a-8a95-cec2-986e-000000000008 8302 1726776616.49584: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 8218 1726776616.49813: no more pending results, returning what we have 8218 1726776616.49815: results queue empty 8218 1726776616.49815: checking for any_errors_fatal 8218 1726776616.49820: done 
checking for any_errors_fatal 8218 1726776616.49821: checking for max_fail_percentage 8218 1726776616.49822: done checking for max_fail_percentage 8218 1726776616.49822: checking to see if all hosts have failed and the running result is not ok 8218 1726776616.49823: done checking to see if all hosts have failed 8218 1726776616.49823: getting the remaining hosts for this loop 8218 1726776616.49824: done getting the remaining hosts for this loop 8218 1726776616.49826: getting the next task for host managed_node2 8218 1726776616.49833: done getting next task for host managed_node2 8218 1726776616.49834: ^ task is: TASK: Ensure required packages are installed 8218 1726776616.49835: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776616.49837: getting variables 8218 1726776616.49838: in VariableManager get_vars() 8218 1726776616.49854: Calling all_inventory to load vars for managed_node2 8218 1726776616.49858: Calling groups_inventory to load vars for managed_node2 8218 1726776616.49860: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776616.49868: Calling all_plugins_play to load vars for managed_node2 8218 1726776616.49870: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776616.49872: Calling groups_plugins_play to load vars for managed_node2 8218 1726776616.49987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776616.50090: done with get_vars() 8218 1726776616.50097: done getting variables 8218 1726776616.50164: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure required packages are installed] ********************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:22 Thursday 19 September 2024 16:10:16 -0400 (0:00:00.025) 0:00:02.332 **** 8218 1726776616.50184: entering _queue_task() for managed_node2/package 8218 1726776616.50185: Creating lock for package 8218 1726776616.50327: worker is 1 (out of 1 available) 8218 1726776616.50342: exiting _queue_task() for managed_node2/package 8218 1726776616.50352: done queuing things up, now waiting for results queue to drain 8218 1726776616.50353: waiting for pending results... 8303 1726776616.50437: running TaskExecutor() for managed_node2/TASK: Ensure required packages are installed 8303 1726776616.50518: in run() - task 120fa90a-8a95-cec2-986e-000000000009 8303 1726776616.50533: variable 'ansible_search_path' from source: unknown 8303 1726776616.50561: calling self._execute() 8303 1726776616.50609: variable 'ansible_host' from source: host vars for 'managed_node2' 8303 1726776616.50617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8303 1726776616.50626: variable 'omit' from source: magic vars 8303 1726776616.50694: variable 'omit' from source: magic vars 8303 1726776616.50717: variable 'omit' from source: magic vars 8303 1726776616.50976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8303 1726776616.52451: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8303 1726776616.52497: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8303 1726776616.52525: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8303 1726776616.52562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8303 1726776616.52583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8303 1726776616.52651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8303 1726776616.52672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8303 1726776616.52692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8303 1726776616.52719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8303 1726776616.52733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8303 1726776616.52811: variable '__kernel_settings_is_ostree' from source: set_fact 8303 1726776616.52819: variable 'omit' from source: magic vars 8303 1726776616.52849: variable 'omit' from source: magic vars 8303 1726776616.52869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8303 1726776616.52888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8303 1726776616.52903: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8303 1726776616.52916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8303 1726776616.52925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8303 1726776616.52951: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8303 1726776616.52956: variable 'ansible_host' from source: host vars for 'managed_node2' 8303 1726776616.52960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8303 1726776616.53021: Set connection var ansible_connection to ssh 8303 1726776616.53038: Set connection var ansible_pipelining to False 8303 1726776616.53049: Set connection var ansible_timeout to 10 8303 1726776616.53059: Set connection var ansible_module_compression to ZIP_DEFLATED 8303 1726776616.53067: Set connection var ansible_shell_type to sh 8303 1726776616.53071: Set connection var ansible_shell_executable to /bin/sh 8303 1726776616.53084: variable 'ansible_shell_executable' from source: unknown 8303 1726776616.53087: variable 'ansible_connection' from source: unknown 8303 1726776616.53089: variable 'ansible_module_compression' from source: unknown 8303 1726776616.53091: variable 'ansible_shell_type' from source: unknown 8303 1726776616.53092: variable 'ansible_shell_executable' from source: unknown 8303 1726776616.53094: variable 'ansible_host' from source: host vars for 'managed_node2' 8303 1726776616.53096: variable 'ansible_pipelining' from source: unknown 8303 1726776616.53098: variable 'ansible_timeout' from source: unknown 8303 1726776616.53100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8303 1726776616.53163: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8303 1726776616.53175: variable 'omit' from source: magic vars 8303 1726776616.53181: starting attempt loop 8303 1726776616.53185: running the handler 8303 1726776616.53262: variable 'ansible_facts' from source: unknown 8303 1726776616.53371: _low_level_execute_command(): starting 8303 1726776616.53380: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8303 1726776616.55949: stdout chunk (state=2): >>>/root <<< 8303 1726776616.56074: stderr chunk (state=3): >>><<< 8303 1726776616.56080: stdout chunk (state=3): >>><<< 8303 1726776616.56095: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8303 1726776616.56104: _low_level_execute_command(): starting 8303 1726776616.56108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514 `" && echo ansible-tmp-1726776616.5610018-8303-198285962131514="` echo /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514 `" ) && sleep 0' 8303 1726776616.59519: stdout chunk (state=2): >>>ansible-tmp-1726776616.5610018-8303-198285962131514=/root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514 <<< 8303 1726776616.59661: stderr chunk (state=3): >>><<< 8303 1726776616.59667: stdout chunk (state=3): >>><<< 8303 1726776616.59683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776616.5610018-8303-198285962131514=/root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514 , stderr= 8303 1726776616.59708: variable 'ansible_module_compression' from source: unknown 8303 1726776616.59754: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8303 1726776616.59762: ANSIBALLZ: Acquiring lock 8303 1726776616.59766: ANSIBALLZ: Lock acquired: 
140571206407024 8303 1726776616.59770: ANSIBALLZ: Creating module 8303 1726776616.74766: ANSIBALLZ: Writing module into payload 8303 1726776616.75043: ANSIBALLZ: Writing module 8303 1726776616.75070: ANSIBALLZ: Renaming module 8303 1726776616.75079: ANSIBALLZ: Done creating module 8303 1726776616.75097: variable 'ansible_facts' from source: unknown 8303 1726776616.75214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/AnsiballZ_dnf.py 8303 1726776616.75706: Sending initial data 8303 1726776616.75714: Sent initial data (150 bytes) 8303 1726776616.78265: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpyon7c7ll /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/AnsiballZ_dnf.py <<< 8303 1726776616.80064: stderr chunk (state=3): >>><<< 8303 1726776616.80072: stdout chunk (state=3): >>><<< 8303 1726776616.80095: done transferring module to remote 8303 1726776616.80110: _low_level_execute_command(): starting 8303 1726776616.80117: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/ /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/AnsiballZ_dnf.py && sleep 0' 8303 1726776616.83089: stderr chunk (state=2): >>><<< 8303 1726776616.83101: stdout chunk (state=2): >>><<< 8303 1726776616.83119: _low_level_execute_command() done: rc=0, stdout=, stderr= 8303 1726776616.83125: _low_level_execute_command(): starting 8303 1726776616.83132: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/AnsiballZ_dnf.py && sleep 0' 8303 1726776621.77111: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "procps-ng"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": 
false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8303 1726776621.85574: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8303 1726776621.85620: stderr chunk (state=3): >>><<< 8303 1726776621.85628: stdout chunk (state=3): >>><<< 8303 1726776621.85644: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "procps-ng"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
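
The dnf result above (`"msg": "Nothing to do"`, rc=0) is the idempotent happy path of the package task at tests_change_settings.yml:22: both packages were already present, so the 5.4 s elapsed time is mostly the dnf transaction check. From the logged module args (`"name": ["tuned", "procps-ng"], "state": "present"`) the task plausibly reads (hedged reconstruction, not the verified source):

```yaml
# Hypothetical reconstruction from the logged module args; the generic
# "package" action (seen loading above) dispatched to dnf on this host.
- name: Ensure required packages are installed
  package:
    name:
      - tuned
      - procps-ng
    state: present
```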
8303 1726776621.85676: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'procps-ng'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8303 1726776621.85682: _low_level_execute_command(): starting 8303 1726776621.85686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776616.5610018-8303-198285962131514/ > /dev/null 2>&1 && sleep 0' 8303 1726776621.88101: stderr chunk (state=2): >>><<< 8303 1726776621.88109: stdout chunk (state=2): >>><<< 8303 1726776621.88125: _low_level_execute_command() done: rc=0, stdout=, stderr= 8303 1726776621.88134: handler run complete 8303 1726776621.88160: attempt loop complete, returning result 8303 1726776621.88163: _execute() done 8303 1726776621.88166: dumping result to json 8303 1726776621.88172: done dumping result, returning 8303 1726776621.88179: done running TaskExecutor() for managed_node2/TASK: Ensure required packages are installed [120fa90a-8a95-cec2-986e-000000000009] 8303 1726776621.88184: sending task result for task 120fa90a-8a95-cec2-986e-000000000009 8303 1726776621.88215: done sending task result for task 120fa90a-8a95-cec2-986e-000000000009 8303 1726776621.88218: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8218 1726776621.88399: no more pending results, returning what we have 8218 1726776621.88404: results queue empty 
8218 1726776621.88404: checking for any_errors_fatal
8218 1726776621.88408: done checking for any_errors_fatal
8218 1726776621.88409: checking for max_fail_percentage
8218 1726776621.88410: done checking for max_fail_percentage
8218 1726776621.88411: checking to see if all hosts have failed and the running result is not ok
8218 1726776621.88412: done checking to see if all hosts have failed
8218 1726776621.88412: getting the remaining hosts for this loop
8218 1726776621.88413: done getting the remaining hosts for this loop
8218 1726776621.88417: getting the next task for host managed_node2
8218 1726776621.88421: done getting next task for host managed_node2
8218 1726776621.88423: ^ task is: TASK: See if tuned has a profile subdir
8218 1726776621.88424: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776621.88426: getting variables
8218 1726776621.88428: in VariableManager get_vars()
8218 1726776621.88453: Calling all_inventory to load vars for managed_node2
8218 1726776621.88456: Calling groups_inventory to load vars for managed_node2
8218 1726776621.88458: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776621.88470: Calling all_plugins_play to load vars for managed_node2
8218 1726776621.88472: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776621.88475: Calling groups_plugins_play to load vars for managed_node2
8218 1726776621.88591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776621.88696: done with get_vars()
8218 1726776621.88705: done getting variables

TASK [See if tuned has a profile subdir] ***************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:28
Thursday 19 September 2024 16:10:21 -0400 (0:00:05.385) 0:00:07.718 ****

8218 1726776621.88772: entering _queue_task() for managed_node2/stat
8218 1726776621.88923: worker is 1 (out of 1 available)
8218 1726776621.88936: exiting _queue_task() for managed_node2/stat
8218 1726776621.88947: done queuing things up, now waiting for results queue to drain
8218 1726776621.88949: waiting for pending results...
8392 1726776621.89042: running TaskExecutor() for managed_node2/TASK: See if tuned has a profile subdir 8392 1726776621.89127: in run() - task 120fa90a-8a95-cec2-986e-00000000000a 8392 1726776621.89145: variable 'ansible_search_path' from source: unknown 8392 1726776621.89173: calling self._execute() 8392 1726776621.89225: variable 'ansible_host' from source: host vars for 'managed_node2' 8392 1726776621.89235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8392 1726776621.89243: variable 'omit' from source: magic vars 8392 1726776621.89313: variable 'omit' from source: magic vars 8392 1726776621.89340: variable 'omit' from source: magic vars 8392 1726776621.89366: variable 'omit' from source: magic vars 8392 1726776621.89398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8392 1726776621.89469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8392 1726776621.89488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8392 1726776621.89503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8392 1726776621.89514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8392 1726776621.89539: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8392 1726776621.89544: variable 'ansible_host' from source: host vars for 'managed_node2' 8392 1726776621.89549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8392 1726776621.89616: Set connection var ansible_connection to ssh 8392 1726776621.89624: Set connection var ansible_pipelining to False 8392 1726776621.89632: Set connection var ansible_timeout to 10 8392 1726776621.89640: Set connection var ansible_module_compression to ZIP_DEFLATED 8392 
1726776621.89645: Set connection var ansible_shell_type to sh 8392 1726776621.89651: Set connection var ansible_shell_executable to /bin/sh 8392 1726776621.89666: variable 'ansible_shell_executable' from source: unknown 8392 1726776621.89670: variable 'ansible_connection' from source: unknown 8392 1726776621.89674: variable 'ansible_module_compression' from source: unknown 8392 1726776621.89677: variable 'ansible_shell_type' from source: unknown 8392 1726776621.89681: variable 'ansible_shell_executable' from source: unknown 8392 1726776621.89684: variable 'ansible_host' from source: host vars for 'managed_node2' 8392 1726776621.89688: variable 'ansible_pipelining' from source: unknown 8392 1726776621.89691: variable 'ansible_timeout' from source: unknown 8392 1726776621.89694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8392 1726776621.89831: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8392 1726776621.89844: variable 'omit' from source: magic vars 8392 1726776621.89849: starting attempt loop 8392 1726776621.89853: running the handler 8392 1726776621.89864: _low_level_execute_command(): starting 8392 1726776621.89872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8392 1726776621.92738: stdout chunk (state=2): >>>/root <<< 8392 1726776621.92865: stderr chunk (state=3): >>><<< 8392 1726776621.92872: stdout chunk (state=3): >>><<< 8392 1726776621.92888: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8392 1726776621.92899: _low_level_execute_command(): starting 8392 1726776621.92905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362 `" && 
echo ansible-tmp-1726776621.9289517-8392-226783951740362="` echo /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362 `" ) && sleep 0' 8392 1726776621.95912: stdout chunk (state=2): >>>ansible-tmp-1726776621.9289517-8392-226783951740362=/root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362 <<< 8392 1726776621.96044: stderr chunk (state=3): >>><<< 8392 1726776621.96050: stdout chunk (state=3): >>><<< 8392 1726776621.96064: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776621.9289517-8392-226783951740362=/root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362 , stderr= 8392 1726776621.96110: variable 'ansible_module_compression' from source: unknown 8392 1726776621.96158: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8392 1726776621.96184: variable 'ansible_facts' from source: unknown 8392 1726776621.96247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/AnsiballZ_stat.py 8392 1726776621.96340: Sending initial data 8392 1726776621.96346: Sent initial data (151 bytes) 8392 1726776621.98848: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpo1cfbwip /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/AnsiballZ_stat.py <<< 8392 1726776622.00039: stderr chunk (state=3): >>><<< 8392 1726776622.00047: stdout chunk (state=3): >>><<< 8392 1726776622.00069: done transferring module to remote 8392 1726776622.00081: _low_level_execute_command(): starting 8392 1726776622.00087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/ /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/AnsiballZ_stat.py && sleep 0' 8392 1726776622.03034: stderr chunk (state=2): >>><<< 8392 1726776622.03043: stdout chunk (state=2): >>><<< 8392 
1726776622.03058: _low_level_execute_command() done: rc=0, stdout=, stderr= 8392 1726776622.03061: _low_level_execute_command(): starting 8392 1726776622.03068: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/AnsiballZ_stat.py && sleep 0' 8392 1726776622.18369: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8392 1726776622.19431: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8392 1726776622.19478: stderr chunk (state=3): >>><<< 8392 1726776622.19486: stdout chunk (state=3): >>><<< 8392 1726776622.19501: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
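The module output above shows `stat` being invoked with `path: /etc/tuned/profiles` and reporting `"exists": false`. Reconstructed from the logged module_args, the "See if tuned has a profile subdir" task would be roughly as follows; the `register` name is an inference from the later `variable '__tuned_profiles' from source: set_fact` entry, not something shown directly:

```yaml
# Sketch reconstructed from the logged module_args for the stat module.
# The register name is inferred from later log entries and may differ.
- name: See if tuned has a profile subdir
  stat:
    path: /etc/tuned/profiles
  register: __tuned_profiles
```

The remaining module_args in the log (`follow`, `get_checksum`, `get_mime`, `get_attributes`, `checksum_algorithm`) are the module's defaults, so they need not appear in the task.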
8392 1726776622.19541: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8392 1726776622.19553: _low_level_execute_command(): starting
8392 1726776622.19559: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776621.9289517-8392-226783951740362/ > /dev/null 2>&1 && sleep 0'
8392 1726776622.21975: stderr chunk (state=2): >>><<<
8392 1726776622.21983: stdout chunk (state=2): >>><<<
8392 1726776622.22000: _low_level_execute_command() done: rc=0, stdout=, stderr=
8392 1726776622.22006: handler run complete
8392 1726776622.22022: attempt loop complete, returning result
8392 1726776622.22025: _execute() done
8392 1726776622.22030: dumping result to json
8392 1726776622.22034: done dumping result, returning
8392 1726776622.22041: done running TaskExecutor() for managed_node2/TASK: See if tuned has a profile subdir [120fa90a-8a95-cec2-986e-00000000000a]
8392 1726776622.22046: sending task result for task 120fa90a-8a95-cec2-986e-00000000000a
8392 1726776622.22076: done sending task result for task 120fa90a-8a95-cec2-986e-00000000000a
8392 1726776622.22080: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
8218 1726776622.22208: no more pending results, returning what we have
8218 1726776622.22211: results queue empty
8218 1726776622.22211: checking for any_errors_fatal
8218 1726776622.22217: done checking for any_errors_fatal
8218 1726776622.22217: checking for max_fail_percentage
8218 1726776622.22219: done checking for max_fail_percentage
8218 1726776622.22219: checking to see if all hosts have failed and the running result is not ok
8218 1726776622.22220: done checking to see if all hosts have failed
8218 1726776622.22221: getting the remaining hosts for this loop
8218 1726776622.22222: done getting the remaining hosts for this loop
8218 1726776622.22225: getting the next task for host managed_node2
8218 1726776622.22231: done getting next task for host managed_node2
8218 1726776622.22233: ^ task is: TASK: Set profile dir
8218 1726776622.22234: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776622.22237: getting variables
8218 1726776622.22238: in VariableManager get_vars()
8218 1726776622.22266: Calling all_inventory to load vars for managed_node2
8218 1726776622.22269: Calling groups_inventory to load vars for managed_node2
8218 1726776622.22272: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776622.22281: Calling all_plugins_play to load vars for managed_node2
8218 1726776622.22283: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776622.22286: Calling groups_plugins_play to load vars for managed_node2
8218 1726776622.22415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776622.22520: done with get_vars()
8218 1726776622.22527: done getting variables
8218 1726776622.22573: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set profile dir] *********************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:33
Thursday 19 September 2024 16:10:22 -0400 (0:00:00.338) 0:00:08.056 ****

8218 1726776622.22592: entering _queue_task() for managed_node2/set_fact
8218 1726776622.22743: worker is 1 (out of 1 available)
8218 1726776622.22755: exiting _queue_task() for managed_node2/set_fact
8218 1726776622.22768: done queuing things up, now waiting for results queue to drain
8218 1726776622.22770: waiting for pending results...
8415 1726776622.22859: running TaskExecutor() for managed_node2/TASK: Set profile dir
8415 1726776622.22948: in run() - task 120fa90a-8a95-cec2-986e-00000000000b
8415 1726776622.22963: variable 'ansible_search_path' from source: unknown
8415 1726776622.22990: calling self._execute()
8415 1726776622.23045: variable 'ansible_host' from source: host vars for 'managed_node2'
8415 1726776622.23055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8415 1726776622.23063: variable 'omit' from source: magic vars
8415 1726776622.23135: variable 'omit' from source: magic vars
8415 1726776622.23161: variable 'omit' from source: magic vars
8415 1726776622.23395: variable '__dir' from source: task vars
8415 1726776622.23484: variable '__tuned_profiles' from source: set_fact
8415 1726776622.23514: variable 'omit' from source: magic vars
8415 1726776622.23547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8415 1726776622.23571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8415 1726776622.23588: trying
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8415 1726776622.23600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8415 1726776622.23609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8415 1726776622.23630: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8415 1726776622.23635: variable 'ansible_host' from source: host vars for 'managed_node2' 8415 1726776622.23640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8415 1726776622.23708: Set connection var ansible_connection to ssh 8415 1726776622.23716: Set connection var ansible_pipelining to False 8415 1726776622.23723: Set connection var ansible_timeout to 10 8415 1726776622.23732: Set connection var ansible_module_compression to ZIP_DEFLATED 8415 1726776622.23737: Set connection var ansible_shell_type to sh 8415 1726776622.23743: Set connection var ansible_shell_executable to /bin/sh 8415 1726776622.23758: variable 'ansible_shell_executable' from source: unknown 8415 1726776622.23762: variable 'ansible_connection' from source: unknown 8415 1726776622.23765: variable 'ansible_module_compression' from source: unknown 8415 1726776622.23769: variable 'ansible_shell_type' from source: unknown 8415 1726776622.23772: variable 'ansible_shell_executable' from source: unknown 8415 1726776622.23775: variable 'ansible_host' from source: host vars for 'managed_node2' 8415 1726776622.23779: variable 'ansible_pipelining' from source: unknown 8415 1726776622.23782: variable 'ansible_timeout' from source: unknown 8415 1726776622.23786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8415 1726776622.23877: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
8415 1726776622.23888: variable 'omit' from source: magic vars
8415 1726776622.23893: starting attempt loop
8415 1726776622.23897: running the handler
8415 1726776622.23908: handler run complete
8415 1726776622.23918: attempt loop complete, returning result
8415 1726776622.23921: _execute() done
8415 1726776622.23924: dumping result to json
8415 1726776622.23928: done dumping result, returning
8415 1726776622.23935: done running TaskExecutor() for managed_node2/TASK: Set profile dir [120fa90a-8a95-cec2-986e-00000000000b]
8415 1726776622.23941: sending task result for task 120fa90a-8a95-cec2-986e-00000000000b
8415 1726776622.23959: done sending task result for task 120fa90a-8a95-cec2-986e-00000000000b
8415 1726776622.23962: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "__profile_dir": "/etc/tuned/kernel_settings"
    },
    "changed": false
}
8218 1726776622.24087: no more pending results, returning what we have
8218 1726776622.24089: results queue empty
8218 1726776622.24090: checking for any_errors_fatal
8218 1726776622.24095: done checking for any_errors_fatal
8218 1726776622.24096: checking for max_fail_percentage
8218 1726776622.24097: done checking for max_fail_percentage
8218 1726776622.24097: checking to see if all hosts have failed and the running result is not ok
8218 1726776622.24098: done checking to see if all hosts have failed
8218 1726776622.24099: getting the remaining hosts for this loop
8218 1726776622.24100: done getting the remaining hosts for this loop
8218 1726776622.24102: getting the next task for host managed_node2
8218 1726776622.24107: done getting next task for host managed_node2
8218 1726776622.24109: ^ task is: TASK: Ensure kernel settings profile directory exists
8218 1726776622.24110: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776622.24112: getting variables
8218 1726776622.24113: in VariableManager get_vars()
8218 1726776622.24136: Calling all_inventory to load vars for managed_node2
8218 1726776622.24138: Calling groups_inventory to load vars for managed_node2
8218 1726776622.24141: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776622.24148: Calling all_plugins_play to load vars for managed_node2
8218 1726776622.24150: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776622.24151: Calling groups_plugins_play to load vars for managed_node2
8218 1726776622.24246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776622.24348: done with get_vars()
8218 1726776622.24355: done getting variables

TASK [Ensure kernel settings profile directory exists] *************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:39
Thursday 19 September 2024 16:10:22 -0400 (0:00:00.018) 0:00:08.074 ****

8218 1726776622.24414: entering _queue_task() for managed_node2/file
8218 1726776622.24558: worker is 1 (out of 1 available)
8218 1726776622.24573: exiting _queue_task() for managed_node2/file
8218 1726776622.24583: done queuing things up, now waiting for results queue to drain
8218 1726776622.24585: waiting for pending results...
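The "Set profile dir" task set `__profile_dir` to `/etc/tuned/kernel_settings`, and the "Ensure kernel settings profile directory exists" task now being queued uses the `file` module with `state: directory` and `mode: '0755'` (visible in its module_args further down). Reconstructed from the log, the pair of tasks would look roughly like this; the `set_fact` expression is an assumption, since only the inputs (`__dir`, `__tuned_profiles`) and the final value appear in the log:

```yaml
# Sketch; __dir and __tuned_profiles appear in the log as task vars,
# but the exact conditional expression is assumed, not logged.
- name: Set profile dir
  set_fact:
    __profile_dir: "{{ __dir }}/kernel_settings"
  vars:
    __dir: "{{ '/etc/tuned/profiles' if __tuned_profiles.stat.exists else '/etc/tuned' }}"

# The path, state, and mode below come from the logged file-module args.
- name: Ensure kernel settings profile directory exists
  file:
    path: "{{ __profile_dir }}"
    state: directory
    mode: "0755"
```

On this host `/etc/tuned/profiles` does not exist (the earlier `stat` result), which is consistent with the profile directory landing directly under `/etc/tuned`.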
8416 1726776622.24670: running TaskExecutor() for managed_node2/TASK: Ensure kernel settings profile directory exists 8416 1726776622.24749: in run() - task 120fa90a-8a95-cec2-986e-00000000000c 8416 1726776622.24762: variable 'ansible_search_path' from source: unknown 8416 1726776622.24786: calling self._execute() 8416 1726776622.24838: variable 'ansible_host' from source: host vars for 'managed_node2' 8416 1726776622.24848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8416 1726776622.24857: variable 'omit' from source: magic vars 8416 1726776622.24925: variable 'omit' from source: magic vars 8416 1726776622.24948: variable 'omit' from source: magic vars 8416 1726776622.24966: variable '__profile_dir' from source: set_fact 8416 1726776622.25213: variable '__profile_dir' from source: set_fact 8416 1726776622.25239: variable 'omit' from source: magic vars 8416 1726776622.25271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8416 1726776622.25295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8416 1726776622.25313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8416 1726776622.25328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8416 1726776622.25340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8416 1726776622.25362: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8416 1726776622.25367: variable 'ansible_host' from source: host vars for 'managed_node2' 8416 1726776622.25370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8416 1726776622.25433: Set connection var ansible_connection to ssh 8416 1726776622.25441: Set connection var ansible_pipelining to False 
8416 1726776622.25447: Set connection var ansible_timeout to 10 8416 1726776622.25454: Set connection var ansible_module_compression to ZIP_DEFLATED 8416 1726776622.25459: Set connection var ansible_shell_type to sh 8416 1726776622.25462: Set connection var ansible_shell_executable to /bin/sh 8416 1726776622.25475: variable 'ansible_shell_executable' from source: unknown 8416 1726776622.25478: variable 'ansible_connection' from source: unknown 8416 1726776622.25480: variable 'ansible_module_compression' from source: unknown 8416 1726776622.25482: variable 'ansible_shell_type' from source: unknown 8416 1726776622.25483: variable 'ansible_shell_executable' from source: unknown 8416 1726776622.25485: variable 'ansible_host' from source: host vars for 'managed_node2' 8416 1726776622.25487: variable 'ansible_pipelining' from source: unknown 8416 1726776622.25488: variable 'ansible_timeout' from source: unknown 8416 1726776622.25490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8416 1726776622.25635: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8416 1726776622.25646: variable 'omit' from source: magic vars 8416 1726776622.25652: starting attempt loop 8416 1726776622.25656: running the handler 8416 1726776622.25668: _low_level_execute_command(): starting 8416 1726776622.25675: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8416 1726776622.28025: stdout chunk (state=2): >>>/root <<< 8416 1726776622.28147: stderr chunk (state=3): >>><<< 8416 1726776622.28153: stdout chunk (state=3): >>><<< 8416 1726776622.28171: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8416 1726776622.28185: _low_level_execute_command(): starting 8416 1726776622.28190: _low_level_execute_command(): executing: 
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793 `" && echo ansible-tmp-1726776622.2817903-8416-222383116735793="` echo /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793 `" ) && sleep 0' 8416 1726776622.30624: stdout chunk (state=2): >>>ansible-tmp-1726776622.2817903-8416-222383116735793=/root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793 <<< 8416 1726776622.30757: stderr chunk (state=3): >>><<< 8416 1726776622.30763: stdout chunk (state=3): >>><<< 8416 1726776622.30780: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776622.2817903-8416-222383116735793=/root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793 , stderr= 8416 1726776622.30812: variable 'ansible_module_compression' from source: unknown 8416 1726776622.30855: ANSIBALLZ: Using lock for file 8416 1726776622.30860: ANSIBALLZ: Acquiring lock 8416 1726776622.30866: ANSIBALLZ: Lock acquired: 140571206408320 8416 1726776622.30871: ANSIBALLZ: Creating module 8416 1726776622.41281: ANSIBALLZ: Writing module into payload 8416 1726776622.41431: ANSIBALLZ: Writing module 8416 1726776622.41449: ANSIBALLZ: Renaming module 8416 1726776622.41459: ANSIBALLZ: Done creating module 8416 1726776622.41475: variable 'ansible_facts' from source: unknown 8416 1726776622.41534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/AnsiballZ_file.py 8416 1726776622.41639: Sending initial data 8416 1726776622.41647: Sent initial data (151 bytes) 8416 1726776622.44454: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmppzhyiagi /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/AnsiballZ_file.py <<< 8416 1726776622.46000: stderr chunk (state=3): >>><<< 8416 1726776622.46007: stdout chunk (state=3): >>><<< 8416 1726776622.46026: done transferring module to 
remote 8416 1726776622.46041: _low_level_execute_command(): starting 8416 1726776622.46047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/ /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/AnsiballZ_file.py && sleep 0' 8416 1726776622.48706: stderr chunk (state=2): >>><<< 8416 1726776622.48716: stdout chunk (state=2): >>><<< 8416 1726776622.48733: _low_level_execute_command() done: rc=0, stdout=, stderr= 8416 1726776622.48739: _low_level_execute_command(): starting 8416 1726776622.48744: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/AnsiballZ_file.py && sleep 0' 8416 1726776622.65067: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8416 1726776622.66229: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8416 1726776622.66276: stderr chunk (state=3): >>><<< 8416 1726776622.66283: stdout chunk (state=3): >>><<< 8416 1726776622.66303: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
8416 1726776622.66339: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8416 1726776622.66350: _low_level_execute_command(): starting
8416 1726776622.66355: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776622.2817903-8416-222383116735793/ > /dev/null 2>&1 && sleep 0'
8416 1726776622.69092: stderr chunk (state=2): >>><<<
8416 1726776622.69102: stdout chunk (state=2): >>><<<
8416 1726776622.69119: _low_level_execute_command() done: rc=0, stdout=, stderr=
8416 1726776622.69127: handler run complete
8416 1726776622.69158: attempt loop complete, returning result
8416 1726776622.69162: _execute() done
8416 1726776622.69168: dumping result to json
8416 1726776622.69174: done dumping result, returning
8416 1726776622.69181: done running TaskExecutor() for managed_node2/TASK: Ensure kernel settings profile directory exists [120fa90a-8a95-cec2-986e-00000000000c]
8416 1726776622.69186: sending task result for task 120fa90a-8a95-cec2-986e-00000000000c
8416 1726776622.69231: done sending task result for task 120fa90a-8a95-cec2-986e-00000000000c
8416 1726776622.69236: WORKER PROCESS EXITING
changed: [managed_node2] => {
    "changed": true,
    "gid": 0,
    "group": "root",
    "mode": "0755",
    "owner": "root",
    "path": "/etc/tuned/kernel_settings",
    "secontext": "unconfined_u:object_r:tuned_etc_t:s0",
    "size": 6,
    "state": "directory",
    "uid": 0
}
8218 1726776622.69686: no more pending results, returning what we have
8218 1726776622.69690: results queue empty
8218 1726776622.69690: checking for any_errors_fatal
8218 1726776622.69696: done checking for any_errors_fatal
8218 1726776622.69696: checking for max_fail_percentage
8218 1726776622.69698: done checking for max_fail_percentage
8218 1726776622.69699: checking to see if all hosts have failed and the running result is not ok
8218 1726776622.69699: done checking to see if all hosts have failed
8218 1726776622.69700: getting the remaining hosts for this loop
8218 1726776622.69701: done getting the remaining hosts for this loop
8218 1726776622.69704: getting the next task for host managed_node2
8218 1726776622.69709: done getting next task for host managed_node2
8218 1726776622.69711: ^ task is: TASK: Generate a configuration for kernel settings
8218 1726776622.69712: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776622.69715: getting variables
8218 1726776622.69716: in VariableManager get_vars()
8218 1726776622.69745: Calling all_inventory to load vars for managed_node2
8218 1726776622.69748: Calling groups_inventory to load vars for managed_node2
8218 1726776622.69751: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776622.69760: Calling all_plugins_play to load vars for managed_node2
8218 1726776622.69763: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776622.69769: Calling groups_plugins_play to load vars for managed_node2
8218 1726776622.69979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776622.70168: done with get_vars()
8218 1726776622.70178: done getting variables
8218 1726776622.70298: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Generate a configuration for kernel settings] ****************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:45
Thursday 19 September 2024 16:10:22 -0400 (0:00:00.459) 0:00:08.533 ****

8218 1726776622.70324: entering _queue_task() for managed_node2/copy
8218 1726776622.70510: worker is 1 (out of 1 available)
8218 1726776622.70523: exiting _queue_task() for managed_node2/copy
8218 1726776622.70536: done queuing things up, now waiting for results queue to drain
8218 1726776622.70538: waiting for pending results...
8433 1726776622.70730: running TaskExecutor() for managed_node2/TASK: Generate a configuration for kernel settings 8433 1726776622.70833: in run() - task 120fa90a-8a95-cec2-986e-00000000000d 8433 1726776622.70851: variable 'ansible_search_path' from source: unknown 8433 1726776622.70886: calling self._execute() 8433 1726776622.70956: variable 'ansible_host' from source: host vars for 'managed_node2' 8433 1726776622.70969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8433 1726776622.70978: variable 'omit' from source: magic vars 8433 1726776622.71073: variable 'omit' from source: magic vars 8433 1726776622.71104: variable 'omit' from source: magic vars 8433 1726776622.71388: variable '__profile_dir' from source: set_fact 8433 1726776622.71415: variable 'omit' from source: magic vars 8433 1726776622.71451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8433 1726776622.71481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8433 1726776622.71498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8433 1726776622.71512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8433 1726776622.71522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8433 1726776622.71548: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8433 1726776622.71553: variable 'ansible_host' from source: host vars for 'managed_node2' 8433 1726776622.71556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8433 1726776622.71696: Set connection var ansible_connection to ssh 8433 1726776622.71703: Set connection var ansible_pipelining to False 8433 1726776622.71709: Set connection var ansible_timeout to 10 8433 
1726776622.71715: Set connection var ansible_module_compression to ZIP_DEFLATED 8433 1726776622.71720: Set connection var ansible_shell_type to sh 8433 1726776622.71725: Set connection var ansible_shell_executable to /bin/sh 8433 1726776622.71743: variable 'ansible_shell_executable' from source: unknown 8433 1726776622.71747: variable 'ansible_connection' from source: unknown 8433 1726776622.71750: variable 'ansible_module_compression' from source: unknown 8433 1726776622.71752: variable 'ansible_shell_type' from source: unknown 8433 1726776622.71755: variable 'ansible_shell_executable' from source: unknown 8433 1726776622.71757: variable 'ansible_host' from source: host vars for 'managed_node2' 8433 1726776622.71760: variable 'ansible_pipelining' from source: unknown 8433 1726776622.71762: variable 'ansible_timeout' from source: unknown 8433 1726776622.71767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8433 1726776622.71872: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8433 1726776622.71883: variable 'omit' from source: magic vars 8433 1726776622.71888: starting attempt loop 8433 1726776622.71891: running the handler 8433 1726776622.71901: _low_level_execute_command(): starting 8433 1726776622.71907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8433 1726776622.74535: stdout chunk (state=2): >>>/root <<< 8433 1726776622.75145: stderr chunk (state=3): >>><<< 8433 1726776622.75155: stdout chunk (state=3): >>><<< 8433 1726776622.75180: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8433 1726776622.75196: _low_level_execute_command(): starting 8433 1726776622.75202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754 `" && echo ansible-tmp-1726776622.751896-8433-243985162195754="` echo /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754 `" ) && sleep 0' 8433 1726776622.78493: stdout chunk (state=2): >>>ansible-tmp-1726776622.751896-8433-243985162195754=/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754 <<< 8433 1726776622.78642: stderr chunk (state=3): >>><<< 8433 1726776622.78649: stdout chunk (state=3): >>><<< 8433 1726776622.78664: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776622.751896-8433-243985162195754=/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754 , stderr= 8433 1726776622.78679: evaluation_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 8433 1726776622.78702: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf 8433 1726776622.78752: variable 'ansible_module_compression' from source: unknown 8433 1726776622.78794: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8433 1726776622.78818: variable 'ansible_facts' from source: unknown 8433 1726776622.78884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_stat.py 8433 1726776622.79297: Sending initial data 8433 1726776622.79304: Sent initial data (150 bytes) 
8433 1726776622.81670: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpb1oamt3i /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_stat.py <<< 8433 1726776622.83925: stderr chunk (state=3): >>><<< 8433 1726776622.83935: stdout chunk (state=3): >>><<< 8433 1726776622.83960: done transferring module to remote 8433 1726776622.83971: _low_level_execute_command(): starting 8433 1726776622.83977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/ /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_stat.py && sleep 0' 8433 1726776622.87134: stderr chunk (state=2): >>><<< 8433 1726776622.87142: stdout chunk (state=2): >>><<< 8433 1726776622.87156: _low_level_execute_command() done: rc=0, stdout=, stderr= 8433 1726776622.87160: _low_level_execute_command(): starting 8433 1726776622.87167: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_stat.py && sleep 0' 8433 1726776623.02635: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8433 1726776623.03758: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8433 1726776623.03769: stdout chunk (state=3): >>><<< 8433 1726776623.03781: stderr chunk (state=3): >>><<< 8433 1726776623.03794: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 8433 1726776623.03825: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8433 1726776623.04032: Sending initial data 8433 1726776623.04039: Sent initial data (213 bytes) 8433 1726776623.06725: stdout chunk (state=3): >>>sftp> put /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source <<< 8433 1726776623.07835: stderr chunk (state=3): >>><<< 8433 1726776623.07846: stdout chunk (state=3): >>><<< 8433 1726776623.07864: _low_level_execute_command(): starting 8433 1726776623.07870: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/ 
/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source && sleep 0' 8433 1726776623.10597: stderr chunk (state=2): >>><<< 8433 1726776623.10605: stdout chunk (state=2): >>><<< 8433 1726776623.10619: _low_level_execute_command() done: rc=0, stdout=, stderr= 8433 1726776623.10641: variable 'ansible_module_compression' from source: unknown 8433 1726776623.10679: ANSIBALLZ: Using generic lock for ansible.legacy.copy 8433 1726776623.10686: ANSIBALLZ: Acquiring lock 8433 1726776623.10689: ANSIBALLZ: Lock acquired: 140571206407024 8433 1726776623.10695: ANSIBALLZ: Creating module 8433 1726776623.19878: ANSIBALLZ: Writing module into payload 8433 1726776623.20021: ANSIBALLZ: Writing module 8433 1726776623.20047: ANSIBALLZ: Renaming module 8433 1726776623.20054: ANSIBALLZ: Done creating module 8433 1726776623.20067: variable 'ansible_facts' from source: unknown 8433 1726776623.20123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_copy.py 8433 1726776623.20212: Sending initial data 8433 1726776623.20219: Sent initial data (150 bytes) 8433 1726776623.22826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpeoikt8p4 /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_copy.py <<< 8433 1726776623.23944: stderr chunk (state=3): >>><<< 8433 1726776623.23952: stdout chunk (state=3): >>><<< 8433 1726776623.23972: done transferring module to remote 8433 1726776623.23981: _low_level_execute_command(): starting 8433 1726776623.23986: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/ /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_copy.py && sleep 0' 8433 1726776623.26398: stderr chunk (state=2): >>><<< 8433 1726776623.26406: stdout chunk (state=2): >>><<< 8433 1726776623.26419: _low_level_execute_command() done: rc=0, stdout=, 
stderr= 8433 1726776623.26423: _low_level_execute_command(): starting 8433 1726776623.26428: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/AnsiballZ_copy.py && sleep 0' 8433 1726776623.42879: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source", "md5sum": "d5df32baf1a63528844555117ead6672", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "_original_basename": "tuned.conf", "follow": false, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8433 1726776623.44667: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8433 1726776623.44678: stdout chunk (state=3): >>><<< 8433 1726776623.44690: stderr chunk (state=3): >>><<< 8433 1726776623.44704: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source", "md5sum": "d5df32baf1a63528844555117ead6672", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "_original_basename": "tuned.conf", "follow": false, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
8433 1726776623.44741: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', '_original_basename': 'tuned.conf', 'follow': False, 'checksum': '13fdc203370e2b8e7e42c13d94b671b1ac621563', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8433 1726776623.44754: _low_level_execute_command(): starting 8433 1726776623.44761: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/ > /dev/null 2>&1 && sleep 0' 8433 1726776623.47361: stderr chunk (state=2): >>><<< 8433 1726776623.47373: stdout chunk (state=2): >>><<< 8433 1726776623.47387: _low_level_execute_command() done: rc=0, stdout=, stderr= 8433 1726776623.47396: handler run complete 8433 1726776623.47414: attempt loop complete, returning result 8433 1726776623.47417: _execute() done 8433 1726776623.47420: dumping result to json 8433 1726776623.47430: done dumping result, returning 8433 1726776623.47438: done running TaskExecutor() for managed_node2/TASK: Generate a configuration for kernel settings [120fa90a-8a95-cec2-986e-00000000000d] 8433 1726776623.47444: sending task result for task 120fa90a-8a95-cec2-986e-00000000000d 8433 1726776623.47478: done sending task result for task 120fa90a-8a95-cec2-986e-00000000000d 8433 1726776623.47481: WORKER 
PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "d5df32baf1a63528844555117ead6672", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "src": "/root/.ansible/tmp/ansible-tmp-1726776622.751896-8433-243985162195754/source", "state": "file", "uid": 0 } 8218 1726776623.47638: no more pending results, returning what we have 8218 1726776623.47641: results queue empty 8218 1726776623.47642: checking for any_errors_fatal 8218 1726776623.47648: done checking for any_errors_fatal 8218 1726776623.47648: checking for max_fail_percentage 8218 1726776623.47650: done checking for max_fail_percentage 8218 1726776623.47650: checking to see if all hosts have failed and the running result is not ok 8218 1726776623.47651: done checking to see if all hosts have failed 8218 1726776623.47652: getting the remaining hosts for this loop 8218 1726776623.47653: done getting the remaining hosts for this loop 8218 1726776623.47656: getting the next task for host managed_node2 8218 1726776623.47660: done getting next task for host managed_node2 8218 1726776623.47662: ^ task is: TASK: Ensure required services are enabled and started 8218 1726776623.47664: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776623.47668: getting variables 8218 1726776623.47670: in VariableManager get_vars() 8218 1726776623.47694: Calling all_inventory to load vars for managed_node2 8218 1726776623.47696: Calling groups_inventory to load vars for managed_node2 8218 1726776623.47699: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776623.47707: Calling all_plugins_play to load vars for managed_node2 8218 1726776623.47709: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776623.47711: Calling groups_plugins_play to load vars for managed_node2 8218 1726776623.47821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776623.47927: done with get_vars() 8218 1726776623.47936: done getting variables 8218 1726776623.48007: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure required services are enabled and started] ************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:51 Thursday 19 September 2024 16:10:23 -0400 (0:00:00.777) 0:00:09.310 **** 8218 1726776623.48027: entering _queue_task() for managed_node2/service 8218 1726776623.48030: Creating lock for service 8218 1726776623.48218: worker is 1 (out of 1 available) 8218 1726776623.48236: exiting _queue_task() for managed_node2/service 8218 1726776623.48247: done queuing things up, now waiting for results queue to drain 8218 1726776623.48248: waiting for pending results... 
8465 1726776623.48345: running TaskExecutor() for managed_node2/TASK: Ensure required services are enabled and started 8465 1726776623.48438: in run() - task 120fa90a-8a95-cec2-986e-00000000000e 8465 1726776623.48454: variable 'ansible_search_path' from source: unknown 8465 1726776623.48481: calling self._execute() 8465 1726776623.48536: variable 'ansible_host' from source: host vars for 'managed_node2' 8465 1726776623.48544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8465 1726776623.48553: variable 'omit' from source: magic vars 8465 1726776623.48621: variable 'omit' from source: magic vars 8465 1726776623.48649: variable 'omit' from source: magic vars 8465 1726776623.48677: variable 'omit' from source: magic vars 8465 1726776623.48709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8465 1726776623.48736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8465 1726776623.48754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8465 1726776623.48769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8465 1726776623.48781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8465 1726776623.48805: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8465 1726776623.48811: variable 'ansible_host' from source: host vars for 'managed_node2' 8465 1726776623.48816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8465 1726776623.48887: Set connection var ansible_connection to ssh 8465 1726776623.48895: Set connection var ansible_pipelining to False 8465 1726776623.48902: Set connection var ansible_timeout to 10 8465 1726776623.48910: Set connection var ansible_module_compression to 
ZIP_DEFLATED 8465 1726776623.48915: Set connection var ansible_shell_type to sh 8465 1726776623.48921: Set connection var ansible_shell_executable to /bin/sh 8465 1726776623.48939: variable 'ansible_shell_executable' from source: unknown 8465 1726776623.48944: variable 'ansible_connection' from source: unknown 8465 1726776623.48948: variable 'ansible_module_compression' from source: unknown 8465 1726776623.48951: variable 'ansible_shell_type' from source: unknown 8465 1726776623.48954: variable 'ansible_shell_executable' from source: unknown 8465 1726776623.48958: variable 'ansible_host' from source: host vars for 'managed_node2' 8465 1726776623.48962: variable 'ansible_pipelining' from source: unknown 8465 1726776623.48965: variable 'ansible_timeout' from source: unknown 8465 1726776623.48969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8465 1726776623.49078: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8465 1726776623.49092: variable 'omit' from source: magic vars 8465 1726776623.49104: starting attempt loop 8465 1726776623.49111: running the handler 8465 1726776623.49373: variable 'ansible_facts' from source: unknown 8465 1726776623.49463: _low_level_execute_command(): starting 8465 1726776623.49472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8465 1726776623.51930: stdout chunk (state=2): >>>/root <<< 8465 1726776623.52048: stderr chunk (state=3): >>><<< 8465 1726776623.52054: stdout chunk (state=3): >>><<< 8465 1726776623.52075: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8465 1726776623.52088: _low_level_execute_command(): starting 8465 1726776623.52094: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && 
mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401 `" && echo ansible-tmp-1726776623.5208344-8465-208988261186401="` echo /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401 `" ) && sleep 0' 8465 1726776623.54736: stdout chunk (state=2): >>>ansible-tmp-1726776623.5208344-8465-208988261186401=/root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401 <<< 8465 1726776623.55137: stderr chunk (state=3): >>><<< 8465 1726776623.55148: stdout chunk (state=3): >>><<< 8465 1726776623.55168: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776623.5208344-8465-208988261186401=/root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401 , stderr= 8465 1726776623.55204: variable 'ansible_module_compression' from source: unknown 8465 1726776623.55265: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8465 1726776623.55271: ANSIBALLZ: Acquiring lock 8465 1726776623.55274: ANSIBALLZ: Lock acquired: 140571206407024 8465 1726776623.55278: ANSIBALLZ: Creating module 8465 1726776623.80161: ANSIBALLZ: Writing module into payload 8465 1726776623.80308: ANSIBALLZ: Writing module 8465 1726776623.80348: ANSIBALLZ: Renaming module 8465 1726776623.80357: ANSIBALLZ: Done creating module 8465 1726776623.80386: variable 'ansible_facts' from source: unknown 8465 1726776623.80544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/AnsiballZ_systemd.py 8465 1726776623.80652: Sending initial data 8465 1726776623.80659: Sent initial data (154 bytes) 8465 1726776623.83277: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpuhfcer44 /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/AnsiballZ_systemd.py <<< 8465 1726776623.85222: stderr chunk (state=3): >>><<< 8465 1726776623.85230: stdout chunk (state=3): >>><<< 8465 1726776623.85250: done transferring module to 
remote 8465 1726776623.85260: _low_level_execute_command(): starting 8465 1726776623.85265: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/ /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/AnsiballZ_systemd.py && sleep 0' 8465 1726776623.87675: stderr chunk (state=2): >>><<< 8465 1726776623.87685: stdout chunk (state=2): >>><<< 8465 1726776623.87700: _low_level_execute_command() done: rc=0, stdout=, stderr= 8465 1726776623.87705: _low_level_execute_command(): starting 8465 1726776623.87710: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/AnsiballZ_systemd.py && sleep 0' 8465 1726776624.49138: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:06:55 EDT", "WatchdogTimestampMonotonic": "26858152", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "666", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ExecMainStartTimestampMonotonic": "25629470", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "666", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:54 EDT] ; stop_time=[n/a] ; pid=666 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": 
"/system.slice/tuned.service", "MemoryCurrent": "18616320", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": <<< 8465 1726776624.49183: stdout chunk (state=3): >>>"infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": 
"no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:06:55 EDT", "StateChang<<< 8465 1726776624.49194: stdout chunk (state=3): >>>eTimestampMonotonic": "26858157", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:54 EDT", "InactiveExitTimestampMonotonic": "25629508", "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:55 EDT", "ActiveEnterTimestampMonotonic": "26858157", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ConditionTimestampMonotonic": "25628623", "AssertTimestamp": "Thu 
2024-09-19 16:06:54 EDT", "AssertTimestampMonotonic": "25628626", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "c819ef5c23aa4fc8a68e209b78418d95", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8465 1726776624.51145: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8465 1726776624.51154: stdout chunk (state=3): >>><<< 8465 1726776624.51164: stderr chunk (state=3): >>><<< 8465 1726776624.51180: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:06:55 EDT", "WatchdogTimestampMonotonic": "26858152", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "666", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ExecMainStartTimestampMonotonic": "25629470", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "666", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:54 EDT] ; stop_time=[n/a] ; pid=666 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": 
"/system.slice/tuned.service", "MemoryCurrent": "18616320", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": 
"0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": 
"no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:06:55 EDT", "StateChangeTimestampMonotonic": "26858157", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:54 EDT", "InactiveExitTimestampMonotonic": "25629508", "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:55 EDT", "ActiveEnterTimestampMonotonic": "26858157", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ConditionTimestampMonotonic": "25628623", "AssertTimestamp": "Thu 2024-09-19 16:06:54 EDT", "AssertTimestampMonotonic": "25628626", "Transient": "no", "Perpetual": "no", 
"StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "c819ef5c23aa4fc8a68e209b78418d95", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 8465 1726776624.51309: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8465 1726776624.51332: _low_level_execute_command(): starting 8465 1726776624.51338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776623.5208344-8465-208988261186401/ > /dev/null 2>&1 && sleep 0' 8465 1726776624.54736: stderr chunk (state=2): >>><<< 8465 1726776624.54746: stdout chunk (state=2): >>><<< 8465 1726776624.54763: _low_level_execute_command() done: rc=0, stdout=, stderr= 8465 1726776624.54774: handler run complete 8465 1726776624.54823: attempt loop complete, returning result 8465 1726776624.54831: _execute() done 8465 1726776624.54835: dumping result to json 8465 1726776624.54852: done dumping result, returning 8465 1726776624.54859: done running TaskExecutor() for 
managed_node2/TASK: Ensure required services are enabled and started [120fa90a-8a95-cec2-986e-00000000000e] 8465 1726776624.54866: sending task result for task 120fa90a-8a95-cec2-986e-00000000000e 8465 1726776624.54958: done sending task result for task 120fa90a-8a95-cec2-986e-00000000000e 8465 1726776624.54962: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "enabled": true, "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:06:55 EDT", "ActiveEnterTimestampMonotonic": "26858157", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:06:54 EDT", "AssertTimestampMonotonic": "25628626", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write 
cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ConditionTimestampMonotonic": "25628623", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "666", "ExecMainStartTimestamp": "Thu 2024-09-19 16:06:54 EDT", "ExecMainStartTimestampMonotonic": "25629470", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:06:54 EDT] ; stop_time=[n/a] ; pid=666 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:06:54 EDT", "InactiveExitTimestampMonotonic": "25629508", "InvocationID": "c819ef5c23aa4fc8a68e209b78418d95", "JobRunningTimeoutUSec": "infinity", 
"JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "666", "MemoryAccounting": "yes", "MemoryCurrent": "18616320", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", 
"ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:06:55 EDT", "StateChangeTimestampMonotonic": "26858157", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:06:55 EDT", "WatchdogTimestampMonotonic": "26858152", "WatchdogUSec": "0" } } 8218 1726776624.55672: no more pending results, returning what we have 8218 1726776624.55676: results queue empty 8218 1726776624.55676: checking for any_errors_fatal 8218 1726776624.55681: done checking for any_errors_fatal 8218 
1726776624.55682: checking for max_fail_percentage 8218 1726776624.55683: done checking for max_fail_percentage 8218 1726776624.55684: checking to see if all hosts have failed and the running result is not ok 8218 1726776624.55684: done checking to see if all hosts have failed 8218 1726776624.55685: getting the remaining hosts for this loop 8218 1726776624.55686: done getting the remaining hosts for this loop 8218 1726776624.55689: getting the next task for host managed_node2 8218 1726776624.55693: done getting next task for host managed_node2 8218 1726776624.55695: ^ task is: TASK: Apply kernel_settings 8218 1726776624.55697: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776624.55700: getting variables 8218 1726776624.55701: in VariableManager get_vars() 8218 1726776624.55725: Calling all_inventory to load vars for managed_node2 8218 1726776624.55727: Calling groups_inventory to load vars for managed_node2 8218 1726776624.55731: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.55739: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.55741: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.55743: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.55886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.56056: done with get_vars() 8218 1726776624.56065: done getting variables TASK [Apply kernel_settings] *************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:57 Thursday 19 September 2024 16:10:24 -0400 (0:00:01.081) 
0:00:10.392 **** 8218 1726776624.56146: entering _queue_task() for managed_node2/include_role 8218 1726776624.56147: Creating lock for include_role 8218 1726776624.56320: worker is 1 (out of 1 available) 8218 1726776624.56333: exiting _queue_task() for managed_node2/include_role 8218 1726776624.56344: done queuing things up, now waiting for results queue to drain 8218 1726776624.56345: waiting for pending results... 8504 1726776624.56518: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings 8504 1726776624.56636: in run() - task 120fa90a-8a95-cec2-986e-00000000000f 8504 1726776624.56655: variable 'ansible_search_path' from source: unknown 8504 1726776624.56690: calling self._execute() 8504 1726776624.56754: variable 'ansible_host' from source: host vars for 'managed_node2' 8504 1726776624.56763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8504 1726776624.56776: variable 'omit' from source: magic vars 8504 1726776624.56867: _execute() done 8504 1726776624.56875: dumping result to json 8504 1726776624.56880: done dumping result, returning 8504 1726776624.56885: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings [120fa90a-8a95-cec2-986e-00000000000f] 8504 1726776624.56892: sending task result for task 120fa90a-8a95-cec2-986e-00000000000f 8504 1726776624.56932: done sending task result for task 120fa90a-8a95-cec2-986e-00000000000f 8504 1726776624.56937: WORKER PROCESS EXITING 8218 1726776624.57241: no more pending results, returning what we have 8218 1726776624.57245: in VariableManager get_vars() 8218 1726776624.57273: Calling all_inventory to load vars for managed_node2 8218 1726776624.57276: Calling groups_inventory to load vars for managed_node2 8218 1726776624.57279: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.57287: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.57290: Calling groups_plugins_inventory to load vars for managed_node2 
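For context, the `TASK [Apply kernel_settings]` being queued in the trace above comes from `tests_change_settings.yml:57` and, per the log, resolves to an `include_role` of `fedora.linux_system_roles.kernel_settings`. A minimal sketch of what that task would look like in the test playbook follows; only the task name and the included role are confirmed by the log, so everything else (and any role variables the real test passes) is omitted rather than guessed:

```yaml
# Hypothetical reconstruction of the task logged above.
# Confirmed by the log: the task name and the role being included.
# Not shown: any vars the test playbook actually passes to the role.
- name: Apply kernel_settings
  include_role:
    name: fedora.linux_system_roles.kernel_settings
```

The subsequent trace entries (loading `vars/main.yml`, `defaults/main.yml`, `meta/main.yml`, `tasks/main.yml`, and `handlers/main.yml` from the collection path) are the normal expansion of such an `include_role` task.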
8218 1726776624.57293: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.57476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.57649: done with get_vars() 8218 1726776624.57657: variable 'ansible_search_path' from source: unknown 8218 1726776624.58377: variable 'omit' from source: magic vars 8218 1726776624.58402: variable 'omit' from source: magic vars 8218 1726776624.58418: variable 'omit' from source: magic vars 8218 1726776624.58422: we have included files to process 8218 1726776624.58423: generating all_blocks data 8218 1726776624.58424: done generating all_blocks data 8218 1726776624.58425: processing included file: fedora.linux_system_roles.kernel_settings 8218 1726776624.58450: in VariableManager get_vars() 8218 1726776624.58464: done with get_vars() 8218 1726776624.58537: in VariableManager get_vars() 8218 1726776624.58551: done with get_vars() 8218 1726776624.58593: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8218 1726776624.58914: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8218 1726776624.58980: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8218 1726776624.59115: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8218 1726776624.59939: in VariableManager get_vars() 8218 1726776624.59961: done with get_vars() 8218 1726776624.61146: in VariableManager get_vars() 8218 1726776624.61165: done with get_vars() 8218 1726776624.61324: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8218 1726776624.62040: iterating over new_blocks loaded from include file 8218 1726776624.62043: in 
VariableManager get_vars() 8218 1726776624.62058: done with get_vars() 8218 1726776624.62059: filtering new block on tags 8218 1726776624.62094: done filtering new block on tags 8218 1726776624.62098: in VariableManager get_vars() 8218 1726776624.62110: done with get_vars() 8218 1726776624.62112: filtering new block on tags 8218 1726776624.62148: done filtering new block on tags 8218 1726776624.62151: in VariableManager get_vars() 8218 1726776624.62163: done with get_vars() 8218 1726776624.62164: filtering new block on tags 8218 1726776624.62315: done filtering new block on tags 8218 1726776624.62318: in VariableManager get_vars() 8218 1726776624.62333: done with get_vars() 8218 1726776624.62335: filtering new block on tags 8218 1726776624.62351: done filtering new block on tags 8218 1726776624.62353: done iterating over new_blocks loaded from include file 8218 1726776624.62353: extending task lists for all hosts with included blocks 8218 1726776624.63022: done extending task lists 8218 1726776624.63024: done processing included files 8218 1726776624.63024: results queue empty 8218 1726776624.63025: checking for any_errors_fatal 8218 1726776624.63036: done checking for any_errors_fatal 8218 1726776624.63037: checking for max_fail_percentage 8218 1726776624.63038: done checking for max_fail_percentage 8218 1726776624.63039: checking to see if all hosts have failed and the running result is not ok 8218 1726776624.63039: done checking to see if all hosts have failed 8218 1726776624.63040: getting the remaining hosts for this loop 8218 1726776624.63041: done getting the remaining hosts for this loop 8218 1726776624.63044: getting the next task for host managed_node2 8218 1726776624.63048: done getting next task for host managed_node2 8218 1726776624.63050: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8218 1726776624.63052: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776624.63061: getting variables 8218 1726776624.63062: in VariableManager get_vars() 8218 1726776624.63078: Calling all_inventory to load vars for managed_node2 8218 1726776624.63081: Calling groups_inventory to load vars for managed_node2 8218 1726776624.63082: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.63088: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.63090: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.63092: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.63234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.63425: done with get_vars() 8218 1726776624.63435: done getting variables 8218 1726776624.63503: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.073) 0:00:10.465 **** 8218 1726776624.63534: entering _queue_task() for managed_node2/fail 8218 1726776624.63536: 
Creating lock for fail 8218 1726776624.63784: worker is 1 (out of 1 available) 8218 1726776624.63797: exiting _queue_task() for managed_node2/fail 8218 1726776624.63808: done queuing things up, now waiting for results queue to drain 8218 1726776624.63809: waiting for pending results... 8507 1726776624.64008: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8507 1726776624.64143: in run() - task 120fa90a-8a95-cec2-986e-0000000000ad 8507 1726776624.64160: variable 'ansible_search_path' from source: unknown 8507 1726776624.64164: variable 'ansible_search_path' from source: unknown 8507 1726776624.64197: calling self._execute() 8507 1726776624.64267: variable 'ansible_host' from source: host vars for 'managed_node2' 8507 1726776624.64279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8507 1726776624.64288: variable 'omit' from source: magic vars 8507 1726776624.64777: variable 'kernel_settings_sysctl' from source: include params 8507 1726776624.64791: variable '__kernel_settings_state_empty' from source: role '' all vars 8507 1726776624.64799: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8507 1726776624.65118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8507 1726776624.67216: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8507 1726776624.67297: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8507 1726776624.67337: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8507 1726776624.67375: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8507 1726776624.67401: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8507 1726776624.67476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8507 1726776624.67505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8507 1726776624.67533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8507 1726776624.67575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8507 1726776624.67590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8507 1726776624.67644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8507 1726776624.67671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8507 1726776624.67695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8507 1726776624.67736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8507 1726776624.67750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8507 1726776624.67793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8507 1726776624.67815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8507 1726776624.67840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8507 1726776624.67880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8507 1726776624.67894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8507 1726776624.68232: variable 'kernel_settings_sysctl' from source: include params 8507 1726776624.68308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8507 1726776624.68467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8507 1726776624.68504: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8507 1726776624.68537: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8507 1726776624.68565: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8507 1726776624.68608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8507 1726776624.68631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8507 1726776624.68655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8507 1726776624.68683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8507 1726776624.68725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8507 1726776624.68747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8507 1726776624.68772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8507 1726776624.68796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8507 1726776624.68821: Evaluated conditional ((kernel_settings_sysctl | 
selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 8507 1726776624.68827: when evaluation is False, skipping this task 8507 1726776624.68833: _execute() done 8507 1726776624.68836: dumping result to json 8507 1726776624.68840: done dumping result, returning 8507 1726776624.68847: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-cec2-986e-0000000000ad] 8507 1726776624.68853: sending task result for task 120fa90a-8a95-cec2-986e-0000000000ad 8507 1726776624.68886: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000ad 8507 1726776624.68890: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8218 1726776624.69273: no more pending results, returning what we have 8218 1726776624.69276: results queue empty 8218 1726776624.69277: checking for any_errors_fatal 8218 1726776624.69278: done checking for any_errors_fatal 8218 1726776624.69279: checking for max_fail_percentage 8218 1726776624.69281: done checking for max_fail_percentage 8218 1726776624.69281: checking to see if all hosts have failed and the running result is not ok 8218 1726776624.69282: done checking to see if all hosts have failed 8218 1726776624.69283: getting the remaining hosts for this loop 8218 1726776624.69284: done getting the remaining hosts for this loop 8218 1726776624.69287: getting the next task for host managed_node2 8218 1726776624.69293: done getting next task for host 
managed_node2 8218 1726776624.69296: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8218 1726776624.69299: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776624.69311: getting variables 8218 1726776624.69312: in VariableManager get_vars() 8218 1726776624.69349: Calling all_inventory to load vars for managed_node2 8218 1726776624.69353: Calling groups_inventory to load vars for managed_node2 8218 1726776624.69355: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.69362: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.69365: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.69367: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.69552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.69678: done with get_vars() 8218 1726776624.69686: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.062) 0:00:10.528 **** 8218 1726776624.69752: entering _queue_task() for managed_node2/include_tasks 8218 1726776624.69753: Creating lock for include_tasks 8218 1726776624.69910: worker is 1 (out of 1 available) 
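The "Check sysctl settings for boolean values" skip just above comes from Jinja2's `selectattr` filter combined with the `sameas` test: `sameas` is an identity comparison, so only genuine booleans match, never truthy integers or the strings `"true"`/`"false"`. A minimal plain-Python sketch of that check (an illustration only, not Ansible's own implementation; the setting names shown are hypothetical):

```python
def has_boolean_values(settings):
    """Mimic: settings | selectattr("value", "defined")
                       | selectattr("value", "sameas", true/false)
                       | list | length > 0"""
    # selectattr("value", "defined"): keep only entries that have a value.
    defined = [s for s in settings if "value" in s]
    # sameas is an identity test, so `is True` / `is False` is the right analogue.
    return any(s["value"] is True or s["value"] is False for s in defined)

# An int value passes the role's check, so the fail task is skipped (as in this log).
print(has_boolean_values([{"name": "fs.file-max", "value": 65536}]))  # False
# A real boolean would trip the guard and fail the play.
print(has_boolean_values([{"name": "kernel.sysrq", "value": True}]))  # True
```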
8218 1726776624.69923: exiting _queue_task() for managed_node2/include_tasks 8218 1726776624.69936: done queuing things up, now waiting for results queue to drain 8218 1726776624.69937: waiting for pending results... 8510 1726776624.70039: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8510 1726776624.70138: in run() - task 120fa90a-8a95-cec2-986e-0000000000ae 8510 1726776624.70154: variable 'ansible_search_path' from source: unknown 8510 1726776624.70159: variable 'ansible_search_path' from source: unknown 8510 1726776624.70187: calling self._execute() 8510 1726776624.70240: variable 'ansible_host' from source: host vars for 'managed_node2' 8510 1726776624.70249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8510 1726776624.70257: variable 'omit' from source: magic vars 8510 1726776624.70327: _execute() done 8510 1726776624.70335: dumping result to json 8510 1726776624.70339: done dumping result, returning 8510 1726776624.70345: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-cec2-986e-0000000000ae] 8510 1726776624.70352: sending task result for task 120fa90a-8a95-cec2-986e-0000000000ae 8510 1726776624.70378: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000ae 8510 1726776624.70382: WORKER PROCESS EXITING 8218 1726776624.70508: no more pending results, returning what we have 8218 1726776624.70511: in VariableManager get_vars() 8218 1726776624.70537: Calling all_inventory to load vars for managed_node2 8218 1726776624.70539: Calling groups_inventory to load vars for managed_node2 8218 1726776624.70540: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.70551: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.70558: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.70561: 
Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.70671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.70779: done with get_vars() 8218 1726776624.70784: variable 'ansible_search_path' from source: unknown 8218 1726776624.70785: variable 'ansible_search_path' from source: unknown 8218 1726776624.70808: we have included files to process 8218 1726776624.70809: generating all_blocks data 8218 1726776624.70810: done generating all_blocks data 8218 1726776624.70815: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776624.70816: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776624.70817: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8218 1726776624.71453: done processing included file 8218 1726776624.71456: iterating over new_blocks loaded from include file 8218 1726776624.71457: in VariableManager get_vars() 8218 1726776624.71475: done with get_vars() 8218 1726776624.71476: filtering new block on tags 8218 1726776624.71492: done filtering new block on tags 8218 1726776624.71494: in VariableManager get_vars() 8218 1726776624.71506: done with get_vars() 8218 1726776624.71507: filtering new block on tags 8218 1726776624.71528: done filtering new block on tags 8218 1726776624.71532: in VariableManager get_vars() 8218 1726776624.71544: done with get_vars() 8218 1726776624.71545: filtering new block on tags 8218 1726776624.71564: done filtering new block on tags 8218 1726776624.71566: in VariableManager get_vars() 8218 1726776624.71582: done with get_vars() 8218 1726776624.71583: 
filtering new block on tags 8218 1726776624.71596: done filtering new block on tags 8218 1726776624.71598: done iterating over new_blocks loaded from include file 8218 1726776624.71598: extending task lists for all hosts with included blocks 8218 1726776624.71760: done extending task lists 8218 1726776624.71762: done processing included files 8218 1726776624.71762: results queue empty 8218 1726776624.71763: checking for any_errors_fatal 8218 1726776624.71765: done checking for any_errors_fatal 8218 1726776624.71766: checking for max_fail_percentage 8218 1726776624.71767: done checking for max_fail_percentage 8218 1726776624.71770: checking to see if all hosts have failed and the running result is not ok 8218 1726776624.71771: done checking to see if all hosts have failed 8218 1726776624.71771: getting the remaining hosts for this loop 8218 1726776624.71772: done getting the remaining hosts for this loop 8218 1726776624.71774: getting the next task for host managed_node2 8218 1726776624.71778: done getting next task for host managed_node2 8218 1726776624.71780: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8218 1726776624.71782: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8218 1726776624.71790: getting variables 8218 1726776624.71791: in VariableManager get_vars() 8218 1726776624.71802: Calling all_inventory to load vars for managed_node2 8218 1726776624.71804: Calling groups_inventory to load vars for managed_node2 8218 1726776624.71806: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.71810: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.71812: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.71814: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.71945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.72145: done with get_vars() 8218 1726776624.72153: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.024) 0:00:10.552 **** 8218 1726776624.72215: entering _queue_task() for managed_node2/setup 8218 1726776624.72384: worker is 1 (out of 1 available) 8218 1726776624.72400: exiting _queue_task() for managed_node2/setup 8218 1726776624.72413: done queuing things up, now waiting for results queue to drain 8218 1726776624.72417: waiting for pending results... 
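The "Ensure ansible_facts used by role" task queued here runs setup only when something in `__kernel_settings_required_facts` is missing from `ansible_facts`; its `difference` conditional (evaluated a few records below) amounts to a set difference. A plain-Python sketch, with hypothetical fact names since the role's actual list does not appear in this log:

```python
# Hypothetical required-fact names; the role's real list is not shown in the log.
required_facts = {"distribution", "distribution_major_version", "os_family"}

def facts_missing(required, gathered):
    # Mimics: required | difference(gathered.keys() | list) | length > 0
    return len(set(required) - set(gathered)) > 0

# Facts already cached from an earlier gather -> nothing missing, task is skipped.
gathered = {"distribution": "CentOS", "distribution_major_version": "9",
            "os_family": "RedHat"}
print(facts_missing(required_facts, gathered))  # False
```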
8512 1726776624.72589: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8512 1726776624.72718: in run() - task 120fa90a-8a95-cec2-986e-000000000159 8512 1726776624.72737: variable 'ansible_search_path' from source: unknown 8512 1726776624.72742: variable 'ansible_search_path' from source: unknown 8512 1726776624.72771: calling self._execute() 8512 1726776624.72838: variable 'ansible_host' from source: host vars for 'managed_node2' 8512 1726776624.72847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8512 1726776624.72856: variable 'omit' from source: magic vars 8512 1726776624.73335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8512 1726776624.75039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8512 1726776624.75100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8512 1726776624.75137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8512 1726776624.75171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8512 1726776624.75197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8512 1726776624.75452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8512 1726776624.75481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8512 1726776624.75505: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8512 1726776624.75575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8512 1726776624.75591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8512 1726776624.75645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8512 1726776624.75668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8512 1726776624.75694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8512 1726776624.75737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8512 1726776624.75753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8512 1726776624.75914: variable '__kernel_settings_required_facts' from source: role '' all vars 8512 1726776624.75924: variable 'ansible_facts' from source: unknown 8512 1726776624.76004: Evaluated conditional (__kernel_settings_required_facts | 
difference(ansible_facts.keys() | list) | length > 0): False 8512 1726776624.76010: when evaluation is False, skipping this task 8512 1726776624.76014: _execute() done 8512 1726776624.76017: dumping result to json 8512 1726776624.76020: done dumping result, returning 8512 1726776624.76026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-cec2-986e-000000000159] 8512 1726776624.76035: sending task result for task 120fa90a-8a95-cec2-986e-000000000159 8512 1726776624.76063: done sending task result for task 120fa90a-8a95-cec2-986e-000000000159 8512 1726776624.76066: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8218 1726776624.76410: no more pending results, returning what we have 8218 1726776624.76413: results queue empty 8218 1726776624.76414: checking for any_errors_fatal 8218 1726776624.76416: done checking for any_errors_fatal 8218 1726776624.76416: checking for max_fail_percentage 8218 1726776624.76417: done checking for max_fail_percentage 8218 1726776624.76418: checking to see if all hosts have failed and the running result is not ok 8218 1726776624.76419: done checking to see if all hosts have failed 8218 1726776624.76419: getting the remaining hosts for this loop 8218 1726776624.76420: done getting the remaining hosts for this loop 8218 1726776624.76423: getting the next task for host managed_node2 8218 1726776624.76432: done getting next task for host managed_node2 8218 1726776624.76436: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8218 1726776624.76439: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776624.76450: getting variables 8218 1726776624.76451: in VariableManager get_vars() 8218 1726776624.76482: Calling all_inventory to load vars for managed_node2 8218 1726776624.76484: Calling groups_inventory to load vars for managed_node2 8218 1726776624.76486: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.76494: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.76497: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.76499: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.76660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.76887: done with get_vars() 8218 1726776624.76897: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.047) 0:00:10.600 **** 8218 1726776624.76985: entering _queue_task() for managed_node2/stat 8218 1726776624.77160: worker is 1 (out of 1 available) 8218 1726776624.77172: exiting _queue_task() for managed_node2/stat 8218 1726776624.77181: done queuing things up, now 
waiting for results queue to drain 8218 1726776624.77183: waiting for pending results... 8515 1726776624.77376: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8515 1726776624.77510: in run() - task 120fa90a-8a95-cec2-986e-00000000015b 8515 1726776624.77527: variable 'ansible_search_path' from source: unknown 8515 1726776624.77534: variable 'ansible_search_path' from source: unknown 8515 1726776624.77562: calling self._execute() 8515 1726776624.77623: variable 'ansible_host' from source: host vars for 'managed_node2' 8515 1726776624.77633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8515 1726776624.77641: variable 'omit' from source: magic vars 8515 1726776624.78048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8515 1726776624.78313: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8515 1726776624.78362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8515 1726776624.78394: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8515 1726776624.78427: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8515 1726776624.78500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8515 1726776624.78525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8515 1726776624.78556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 8515 1726776624.78580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8515 1726776624.78684: variable '__kernel_settings_is_ostree' from source: set_fact 8515 1726776624.78695: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 8515 1726776624.78699: when evaluation is False, skipping this task 8515 1726776624.78702: _execute() done 8515 1726776624.78706: dumping result to json 8515 1726776624.78709: done dumping result, returning 8515 1726776624.78714: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-cec2-986e-00000000015b] 8515 1726776624.78721: sending task result for task 120fa90a-8a95-cec2-986e-00000000015b 8515 1726776624.78750: done sending task result for task 120fa90a-8a95-cec2-986e-00000000015b 8515 1726776624.78754: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8218 1726776624.79043: no more pending results, returning what we have 8218 1726776624.79046: results queue empty 8218 1726776624.79047: checking for any_errors_fatal 8218 1726776624.79052: done checking for any_errors_fatal 8218 1726776624.79053: checking for max_fail_percentage 8218 1726776624.79054: done checking for max_fail_percentage 8218 1726776624.79054: checking to see if all hosts have failed and the running result is not ok 8218 1726776624.79055: done checking to see if all hosts have failed 8218 1726776624.79056: getting the remaining hosts for this loop 8218 1726776624.79057: done getting the remaining hosts for this loop 8218 1726776624.79060: getting the next task for host managed_node2 8218 1726776624.79065: done getting next task for host managed_node2 8218 
1726776624.79068: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8218 1726776624.79071: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776624.79082: getting variables 8218 1726776624.79083: in VariableManager get_vars() 8218 1726776624.79110: Calling all_inventory to load vars for managed_node2 8218 1726776624.79112: Calling groups_inventory to load vars for managed_node2 8218 1726776624.79114: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776624.79121: Calling all_plugins_play to load vars for managed_node2 8218 1726776624.79124: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776624.79126: Calling groups_plugins_play to load vars for managed_node2 8218 1726776624.79281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776624.79481: done with get_vars() 8218 1726776624.79491: done getting variables 8218 1726776624.79542: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15
Thursday 19 September 2024 16:10:24 -0400 (0:00:00.025) 0:00:10.626 ****
8218 1726776624.79571: entering _queue_task() for managed_node2/set_fact
8218 1726776624.79735: worker is 1 (out of 1 available)
8218 1726776624.79747: exiting _queue_task() for managed_node2/set_fact
8218 1726776624.79757: done queuing things up, now waiting for results queue to drain
8218 1726776624.79759: waiting for pending results...
8516 1726776624.79957: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
8516 1726776624.80090: in run() - task 120fa90a-8a95-cec2-986e-00000000015c
8516 1726776624.80105: variable 'ansible_search_path' from source: unknown
8516 1726776624.80109: variable 'ansible_search_path' from source: unknown
8516 1726776624.80139: calling self._execute()
8516 1726776624.80203: variable 'ansible_host' from source: host vars for 'managed_node2'
8516 1726776624.80213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8516 1726776624.80221: variable 'omit' from source: magic vars
8516 1726776624.80621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8516 1726776624.80953: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8516 1726776624.80992: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8516 1726776624.81027: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8516 1726776624.81063: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8516 1726776624.81134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8516 1726776624.81159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8516 1726776624.81183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8516 1726776624.81208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8516 1726776624.81310: variable '__kernel_settings_is_ostree' from source: set_fact
8516 1726776624.81321: Evaluated conditional (not __kernel_settings_is_ostree is defined): False
8516 1726776624.81327: when evaluation is False, skipping this task
8516 1726776624.81332: _execute() done
8516 1726776624.81336: dumping result to json
8516 1726776624.81339: done dumping result, returning
8516 1726776624.81344: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-cec2-986e-00000000015c]
8516 1726776624.81350: sending task result for task 120fa90a-8a95-cec2-986e-00000000015c
8516 1726776624.81377: done sending task result for task 120fa90a-8a95-cec2-986e-00000000015c
8516 1726776624.81381: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
8218 1726776624.81691: no more pending results, returning what we have
8218 1726776624.81694: results queue empty
8218 1726776624.81695: checking for any_errors_fatal
8218 1726776624.81699: done checking for any_errors_fatal
8218 1726776624.81699: checking for max_fail_percentage
8218 1726776624.81701: done checking for max_fail_percentage
8218 1726776624.81701: checking to see if all hosts have failed and the running result is not ok
8218 1726776624.81702: done checking to see if all hosts have failed
8218 1726776624.81703: getting the remaining hosts for this loop
8218 1726776624.81704: done getting the remaining hosts for this loop
8218 1726776624.81707: getting the next task for host managed_node2
8218 1726776624.81713: done getting next task for host managed_node2
8218 1726776624.81716: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
8218 1726776624.81720: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776624.81734: getting variables
8218 1726776624.81736: in VariableManager get_vars()
8218 1726776624.81765: Calling all_inventory to load vars for managed_node2
8218 1726776624.81768: Calling groups_inventory to load vars for managed_node2
8218 1726776624.81770: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776624.81778: Calling all_plugins_play to load vars for managed_node2
8218 1726776624.81780: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776624.81783: Calling groups_plugins_play to load vars for managed_node2
8218 1726776624.81979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776624.82179: done with get_vars()
8218 1726776624.82187: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22
Thursday 19 September 2024 16:10:24 -0400 (0:00:00.026) 0:00:10.653 ****
8218 1726776624.82273: entering _queue_task() for managed_node2/stat
8218 1726776624.82445: worker is 1 (out of 1 available)
8218 1726776624.82458: exiting _queue_task() for managed_node2/stat
8218 1726776624.82468: done queuing things up, now waiting for results queue to drain
8218 1726776624.82470: waiting for pending results...
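The skipped set_fact above and the stat task just queued correspond to the role's tasks/set_vars.yml:15 and :22. A hedged sketch reconstructed only from what the trace shows (task names, the logged conditionals, the stat module args, and the `__transactional_update_stat` register name that appears later) — this is not the role's verbatim source, and the `__ostree_booted_stat` variable is a hypothetical placeholder:

```yaml
# Reconstructed sketch of tasks/set_vars.yml around lines 15-22; not verbatim.
- name: Set flag to indicate system is ostree
  set_fact:
    # source variable name is an assumption for illustration
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  # logged conditional; it evaluated to False here, so the task was skipped
  when: not __kernel_settings_is_ostree is defined

- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update   # path taken from the stat module_args in the trace
  register: __transactional_update_stat  # register name appears later in the trace
  # logged conditional; it evaluated to True here, so the stat ran
  when: not __kernel_settings_is_transactional is defined
```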
8517 1726776624.82712: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
8517 1726776624.82845: in run() - task 120fa90a-8a95-cec2-986e-00000000015e
8517 1726776624.82861: variable 'ansible_search_path' from source: unknown
8517 1726776624.82865: variable 'ansible_search_path' from source: unknown
8517 1726776624.82894: calling self._execute()
8517 1726776624.82960: variable 'ansible_host' from source: host vars for 'managed_node2'
8517 1726776624.82968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8517 1726776624.82977: variable 'omit' from source: magic vars
8517 1726776624.83388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8517 1726776624.83607: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8517 1726776624.83691: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8517 1726776624.83723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8517 1726776624.83761: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8517 1726776624.83832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8517 1726776624.83857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8517 1726776624.83881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8517 1726776624.83905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8517 1726776624.84016: Evaluated conditional (not __kernel_settings_is_transactional is defined): True
8517 1726776624.84025: variable 'omit' from source: magic vars
8517 1726776624.84080: variable 'omit' from source: magic vars
8517 1726776624.84108: variable 'omit' from source: magic vars
8517 1726776624.84132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8517 1726776624.84155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8517 1726776624.84172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8517 1726776624.84187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8517 1726776624.84196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8517 1726776624.84222: variable 'inventory_hostname' from source: host vars for 'managed_node2'
8517 1726776624.84228: variable 'ansible_host' from source: host vars for 'managed_node2'
8517 1726776624.84234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8517 1726776624.84320: Set connection var ansible_connection to ssh
8517 1726776624.84330: Set connection var ansible_pipelining to False
8517 1726776624.84336: Set connection var ansible_timeout to 10
8517 1726776624.84344: Set connection var ansible_module_compression to ZIP_DEFLATED
8517 1726776624.84349: Set connection var ansible_shell_type to sh
8517 1726776624.84354: Set connection var ansible_shell_executable to /bin/sh
8517 1726776624.84371: variable 'ansible_shell_executable' from source: unknown
8517 1726776624.84376: variable 'ansible_connection' from source: unknown
8517 1726776624.84379: variable 'ansible_module_compression' from source: unknown
8517 1726776624.84382: variable 'ansible_shell_type' from source: unknown
8517 1726776624.84385: variable 'ansible_shell_executable' from source: unknown
8517 1726776624.84388: variable 'ansible_host' from source: host vars for 'managed_node2'
8517 1726776624.84392: variable 'ansible_pipelining' from source: unknown
8517 1726776624.84395: variable 'ansible_timeout' from source: unknown
8517 1726776624.84398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8517 1726776624.84522: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
8517 1726776624.84535: variable 'omit' from source: magic vars
8517 1726776624.84542: starting attempt loop
8517 1726776624.84546: running the handler
8517 1726776624.84557: _low_level_execute_command(): starting
8517 1726776624.84565: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8517 1726776624.87148: stdout chunk (state=2): >>>/root <<<
8517 1726776624.87394: stderr chunk (state=3): >>><<<
8517 1726776624.87401: stdout chunk (state=3): >>><<<
8517 1726776624.87418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=
8517 1726776624.87432: _low_level_execute_command(): starting
8517 1726776624.87441: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408 `" && echo ansible-tmp-1726776624.8742735-8517-730543833408="` echo /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408 `" ) && sleep 0'
8517 1726776624.90535: stdout chunk (state=2): >>>ansible-tmp-1726776624.8742735-8517-730543833408=/root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408 <<<
8517 1726776624.90546: stderr chunk (state=2): >>><<<
8517 1726776624.90556: stdout chunk (state=3): >>><<<
8517 1726776624.90568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776624.8742735-8517-730543833408=/root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408 , stderr=
8517 1726776624.90612: variable 'ansible_module_compression' from source: unknown
8517 1726776624.90672: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
8517 1726776624.90707: variable 'ansible_facts' from source: unknown
8517 1726776624.90808: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/AnsiballZ_stat.py
8517 1726776624.92202: Sending initial data
8517 1726776624.92210: Sent initial data (148 bytes)
8517 1726776624.95477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpk3lxxqgh /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/AnsiballZ_stat.py <<<
8517 1726776624.96893: stderr chunk (state=3): >>><<<
8517 1726776624.96902: stdout chunk (state=3): >>><<<
8517 1726776624.96923: done transferring module to remote
8517 1726776624.96939: _low_level_execute_command(): starting
8517 1726776624.96948: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/ /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/AnsiballZ_stat.py && sleep 0'
8517 1726776625.00394: stderr chunk (state=2): >>><<<
8517 1726776625.00403: stdout chunk (state=2): >>><<<
8517 1726776625.00419: _low_level_execute_command() done: rc=0, stdout=, stderr=
8517 1726776625.00424: _low_level_execute_command(): starting
8517 1726776625.00431: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/AnsiballZ_stat.py && sleep 0'
8517 1726776625.16767: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<<
8517 1726776625.17886: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<<
8517 1726776625.17899: stdout chunk (state=3): >>><<<
8517 1726776625.17910: stderr chunk (state=3): >>><<<
8517 1726776625.17922: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed.
8517 1726776625.18003: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8517 1726776625.18018: _low_level_execute_command(): starting
8517 1726776625.18024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776624.8742735-8517-730543833408/ > /dev/null 2>&1 && sleep 0'
8517 1726776625.21236: stderr chunk (state=2): >>><<<
8517 1726776625.21247: stdout chunk (state=2): >>><<<
8517 1726776625.21267: _low_level_execute_command() done: rc=0, stdout=, stderr=
8517 1726776625.21278: handler run complete
8517 1726776625.21300: attempt loop complete, returning result
8517 1726776625.21305: _execute() done
8517 1726776625.21308: dumping result to json
8517 1726776625.21312: done dumping result, returning
8517 1726776625.21320: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-cec2-986e-00000000015e]
8517 1726776625.21329: sending task result for task 120fa90a-8a95-cec2-986e-00000000015e
8517 1726776625.21373: done sending task result for task 120fa90a-8a95-cec2-986e-00000000015e
8517 1726776625.21378: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
8218 1726776625.21782: no more pending results, returning what we have
8218 1726776625.21785: results queue empty
8218 1726776625.21786: checking for any_errors_fatal
8218 1726776625.21791: done checking for any_errors_fatal
8218 1726776625.21792: checking for max_fail_percentage
8218 1726776625.21793: done checking for max_fail_percentage
8218 1726776625.21794: checking to see if all hosts have failed and the running result is not ok
8218 1726776625.21795: done checking to see if all hosts have failed
8218 1726776625.21795: getting the remaining hosts for this loop
8218 1726776625.21796: done getting the remaining hosts for this loop
8218 1726776625.21800: getting the next task for host managed_node2
8218 1726776625.21805: done getting next task for host managed_node2
8218 1726776625.21808: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
8218 1726776625.21812: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776625.21824: getting variables
8218 1726776625.21825: in VariableManager get_vars()
8218 1726776625.21858: Calling all_inventory to load vars for managed_node2
8218 1726776625.21861: Calling groups_inventory to load vars for managed_node2
8218 1726776625.21865: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776625.21876: Calling all_plugins_play to load vars for managed_node2
8218 1726776625.21879: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776625.21882: Calling groups_plugins_play to load vars for managed_node2
8218 1726776625.22047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776625.22250: done with get_vars()
8218 1726776625.22262: done getting variables
8218 1726776625.22322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27
Thursday 19 September 2024 16:10:25 -0400 (0:00:00.400) 0:00:11.054 ****
8218 1726776625.22355: entering _queue_task() for managed_node2/set_fact
8218 1726776625.22554: worker is 1 (out of 1 available)
8218 1726776625.22567: exiting _queue_task() for managed_node2/set_fact
8218 1726776625.22582: done queuing things up, now waiting for results queue to drain
8218 1726776625.22584: waiting for pending results...
8543 1726776625.22799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
8543 1726776625.22945: in run() - task 120fa90a-8a95-cec2-986e-00000000015f
8543 1726776625.22965: variable 'ansible_search_path' from source: unknown
8543 1726776625.22972: variable 'ansible_search_path' from source: unknown
8543 1726776625.23005: calling self._execute()
8543 1726776625.23085: variable 'ansible_host' from source: host vars for 'managed_node2'
8543 1726776625.23095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8543 1726776625.23105: variable 'omit' from source: magic vars
8543 1726776625.23545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
8543 1726776625.23809: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
8543 1726776625.23841: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
8543 1726776625.23865: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
8543 1726776625.23891: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
8543 1726776625.23946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
8543 1726776625.23963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
8543 1726776625.23983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
8543 1726776625.23999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
8543 1726776625.24086: Evaluated conditional (not __kernel_settings_is_transactional is defined): True
8543 1726776625.24093: variable 'omit' from source: magic vars
8543 1726776625.24132: variable 'omit' from source: magic vars
8543 1726776625.24219: variable '__transactional_update_stat' from source: set_fact
8543 1726776625.24254: variable 'omit' from source: magic vars
8543 1726776625.24273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8543 1726776625.24291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8543 1726776625.24304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8543 1726776625.24315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8543 1726776625.24323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8543 1726776625.24344: variable 'inventory_hostname' from source: host vars for 'managed_node2'
8543 1726776625.24348: variable 'ansible_host' from source: host vars for 'managed_node2'
8543 1726776625.24351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8543 1726776625.24409: Set connection var ansible_connection to ssh
8543 1726776625.24415: Set connection var ansible_pipelining to False
8543 1726776625.24419: Set connection var ansible_timeout to 10
8543 1726776625.24425: Set connection var ansible_module_compression to ZIP_DEFLATED
8543 1726776625.24432: Set connection var ansible_shell_type to sh
8543 1726776625.24438: Set connection var ansible_shell_executable to /bin/sh
8543 1726776625.24459: variable 'ansible_shell_executable' from source: unknown
8543 1726776625.24463: variable 'ansible_connection' from source: unknown
8543 1726776625.24466: variable 'ansible_module_compression' from source: unknown
8543 1726776625.24472: variable 'ansible_shell_type' from source: unknown
8543 1726776625.24476: variable 'ansible_shell_executable' from source: unknown
8543 1726776625.24479: variable 'ansible_host' from source: host vars for 'managed_node2'
8543 1726776625.24483: variable 'ansible_pipelining' from source: unknown
8543 1726776625.24486: variable 'ansible_timeout' from source: unknown
8543 1726776625.24490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8543 1726776625.24559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
8543 1726776625.24572: variable 'omit' from source: magic vars
8543 1726776625.24578: starting attempt loop
8543 1726776625.24582: running the handler
8543 1726776625.24591: handler run complete
8543 1726776625.24599: attempt loop complete, returning result
8543 1726776625.24602: _execute() done
8543 1726776625.24605: dumping result to json
8543 1726776625.24608: done dumping result, returning
8543 1726776625.24614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-cec2-986e-00000000015f]
8543 1726776625.24620: sending task result for task 120fa90a-8a95-cec2-986e-00000000015f
8543 1726776625.24640: done sending task result for task 120fa90a-8a95-cec2-986e-00000000015f
8543 1726776625.24644: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "__kernel_settings_is_transactional": false
    },
    "changed": false
}
8218 1726776625.24763: no more pending results, returning what we have
8218 1726776625.24766: results queue empty
8218 1726776625.24767: checking for any_errors_fatal
8218 1726776625.24773: done checking for any_errors_fatal
8218 1726776625.24773: checking for max_fail_percentage
8218 1726776625.24774: done checking for max_fail_percentage
8218 1726776625.24775: checking to see if all hosts have failed and the running result is not ok
8218 1726776625.24776: done checking to see if all hosts have failed
8218 1726776625.24776: getting the remaining hosts for this loop
8218 1726776625.24777: done getting the remaining hosts for this loop
8218 1726776625.24780: getting the next task for host managed_node2
8218 1726776625.24788: done getting next task for host managed_node2
8218 1726776625.24791: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
8218 1726776625.24794: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776625.24803: getting variables
8218 1726776625.24809: in VariableManager get_vars()
8218 1726776625.24840: Calling all_inventory to load vars for managed_node2
8218 1726776625.24843: Calling groups_inventory to load vars for managed_node2
8218 1726776625.24844: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776625.24852: Calling all_plugins_play to load vars for managed_node2
8218 1726776625.24854: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776625.24856: Calling groups_plugins_play to load vars for managed_node2
8218 1726776625.24992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776625.25108: done with get_vars()
8218 1726776625.25115: done getting variables
8218 1726776625.25194: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31
Thursday 19 September 2024 16:10:25 -0400 (0:00:00.028) 0:00:11.082 ****
8218 1726776625.25216: entering _queue_task() for managed_node2/include_vars
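The "Set flag if transactional-update exists" result above (tasks/set_vars.yml:27) sets `__kernel_settings_is_transactional` to false from the earlier stat result; the trace shows both the `__transactional_update_stat` variable lookup and the final fact value. A hedged sketch of that task, reconstructed from the trace rather than copied from the role source:

```yaml
# Reconstructed sketch of tasks/set_vars.yml:27; not the role's verbatim source.
- name: Set flag if transactional-update exists
  set_fact:
    # stat.exists was false above, matching the logged fact value
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
  when: not __kernel_settings_is_transactional is defined  # logged conditional; True here
```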
8218 1726776625.25217: Creating lock for include_vars 8218 1726776625.25403: worker is 1 (out of 1 available) 8218 1726776625.25414: exiting _queue_task() for managed_node2/include_vars 8218 1726776625.25424: done queuing things up, now waiting for results queue to drain 8218 1726776625.25426: waiting for pending results... 8545 1726776625.25760: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8545 1726776625.25906: in run() - task 120fa90a-8a95-cec2-986e-000000000161 8545 1726776625.25922: variable 'ansible_search_path' from source: unknown 8545 1726776625.25926: variable 'ansible_search_path' from source: unknown 8545 1726776625.25958: calling self._execute() 8545 1726776625.26030: variable 'ansible_host' from source: host vars for 'managed_node2' 8545 1726776625.26039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8545 1726776625.26047: variable 'omit' from source: magic vars 8545 1726776625.26145: variable 'omit' from source: magic vars 8545 1726776625.26211: variable 'omit' from source: magic vars 8545 1726776625.26541: variable 'ffparams' from source: task vars 8545 1726776625.26699: variable 'ansible_facts' from source: unknown 8545 1726776625.26896: variable 'ansible_facts' from source: unknown 8545 1726776625.26998: variable 'ansible_facts' from source: unknown 8545 1726776625.27088: variable 'ansible_facts' from source: unknown 8545 1726776625.27165: variable 'role_path' from source: magic vars 8545 1726776625.27290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8545 1726776625.27633: Loaded config def from plugin (lookup/first_found) 8545 1726776625.27642: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 8545 1726776625.27671: variable 'ansible_search_path' from source: unknown 8545 1726776625.27691: variable 'ansible_search_path' from source: unknown 
8545 1726776625.27700: variable 'ansible_search_path' from source: unknown 8545 1726776625.27708: variable 'ansible_search_path' from source: unknown 8545 1726776625.27714: variable 'ansible_search_path' from source: unknown 8545 1726776625.27733: variable 'omit' from source: magic vars 8545 1726776625.27752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8545 1726776625.27771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8545 1726776625.27789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8545 1726776625.27805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8545 1726776625.27814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8545 1726776625.27839: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8545 1726776625.27844: variable 'ansible_host' from source: host vars for 'managed_node2' 8545 1726776625.27848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8545 1726776625.27912: Set connection var ansible_connection to ssh 8545 1726776625.27920: Set connection var ansible_pipelining to False 8545 1726776625.27926: Set connection var ansible_timeout to 10 8545 1726776625.27935: Set connection var ansible_module_compression to ZIP_DEFLATED 8545 1726776625.27940: Set connection var ansible_shell_type to sh 8545 1726776625.27945: Set connection var ansible_shell_executable to /bin/sh 8545 1726776625.27962: variable 'ansible_shell_executable' from source: unknown 8545 1726776625.27966: variable 'ansible_connection' from source: unknown 8545 1726776625.27969: variable 'ansible_module_compression' from source: unknown 8545 1726776625.27972: variable 'ansible_shell_type' from source: unknown 8545 
1726776625.27975: variable 'ansible_shell_executable' from source: unknown 8545 1726776625.27979: variable 'ansible_host' from source: host vars for 'managed_node2' 8545 1726776625.27983: variable 'ansible_pipelining' from source: unknown 8545 1726776625.27986: variable 'ansible_timeout' from source: unknown 8545 1726776625.27990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8545 1726776625.28063: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8545 1726776625.28075: variable 'omit' from source: magic vars 8545 1726776625.28080: starting attempt loop 8545 1726776625.28084: running the handler 8545 1726776625.28127: handler run complete 8545 1726776625.28138: attempt loop complete, returning result 8545 1726776625.28142: _execute() done 8545 1726776625.28145: dumping result to json 8545 1726776625.28149: done dumping result, returning 8545 1726776625.28156: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-cec2-986e-000000000161] 8545 1726776625.28162: sending task result for task 120fa90a-8a95-cec2-986e-000000000161 8545 1726776625.28189: done sending task result for task 120fa90a-8a95-cec2-986e-000000000161 8545 1726776625.28193: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8218 1726776625.28334: no more pending results, returning what we have 8218 1726776625.28337: results queue 
empty 8218 1726776625.28338: checking for any_errors_fatal 8218 1726776625.28342: done checking for any_errors_fatal 8218 1726776625.28343: checking for max_fail_percentage 8218 1726776625.28344: done checking for max_fail_percentage 8218 1726776625.28345: checking to see if all hosts have failed and the running result is not ok 8218 1726776625.28346: done checking to see if all hosts have failed 8218 1726776625.28346: getting the remaining hosts for this loop 8218 1726776625.28347: done getting the remaining hosts for this loop 8218 1726776625.28350: getting the next task for host managed_node2 8218 1726776625.28357: done getting next task for host managed_node2 8218 1726776625.28360: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8218 1726776625.28362: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776625.28374: getting variables 8218 1726776625.28375: in VariableManager get_vars() 8218 1726776625.28410: Calling all_inventory to load vars for managed_node2 8218 1726776625.28412: Calling groups_inventory to load vars for managed_node2 8218 1726776625.28414: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776625.28423: Calling all_plugins_play to load vars for managed_node2 8218 1726776625.28426: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776625.28430: Calling groups_plugins_play to load vars for managed_node2 8218 1726776625.28590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776625.28793: done with get_vars() 8218 1726776625.28804: done getting variables 8218 1726776625.28860: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:10:25 -0400 (0:00:00.036) 0:00:11.119 **** 8218 1726776625.28893: entering _queue_task() for managed_node2/package 8218 1726776625.29098: worker is 1 (out of 1 available) 8218 1726776625.29112: exiting _queue_task() for managed_node2/package 8218 1726776625.29122: done queuing things up, now waiting for results queue to drain 8218 1726776625.29123: waiting for pending results... 
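The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` lines above show Ansible's dispatch model: the main process queues each task, a forked worker runs `TaskExecutor()` and pushes the result back, and the main process drains the results queue. A minimal sketch of that queue/worker pattern (using threads for simplicity; Ansible's real `WorkerProcess` forks and is far more involved):

```python
import queue
import threading

def run_task(task_name):
    # Stand-in for TaskExecutor().run(): returns a result dict.
    return {"task": task_name, "changed": False}

def worker(task_queue, result_queue):
    while True:
        task = task_queue.get()
        if task is None:  # sentinel: no more work, worker exits
            break
        result_queue.put(run_task(task))

tasks = queue.Queue()
results = queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()
tasks.put("Ensure required packages are installed")
tasks.put(None)
result = results.get()  # main process: "no more pending results, returning what we have"
t.join()
```

This mirrors the log's sequence: queue the task, wait on the results queue, then move to "getting the next task for host managed_node2".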
8549 1726776625.29339: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8549 1726776625.29475: in run() - task 120fa90a-8a95-cec2-986e-0000000000af 8549 1726776625.29493: variable 'ansible_search_path' from source: unknown 8549 1726776625.29497: variable 'ansible_search_path' from source: unknown 8549 1726776625.29528: calling self._execute() 8549 1726776625.29601: variable 'ansible_host' from source: host vars for 'managed_node2' 8549 1726776625.29611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8549 1726776625.29619: variable 'omit' from source: magic vars 8549 1726776625.29716: variable 'omit' from source: magic vars 8549 1726776625.29766: variable 'omit' from source: magic vars 8549 1726776625.29797: variable '__kernel_settings_packages' from source: include_vars 8549 1726776625.30068: variable '__kernel_settings_packages' from source: include_vars 8549 1726776625.30556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8549 1726776625.32413: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8549 1726776625.32489: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8549 1726776625.32541: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8549 1726776625.32578: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8549 1726776625.32605: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8549 1726776625.32700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8549 
1726776625.32727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8549 1726776625.32755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8549 1726776625.32799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8549 1726776625.32814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8549 1726776625.32914: variable '__kernel_settings_is_ostree' from source: set_fact 8549 1726776625.32922: variable 'omit' from source: magic vars 8549 1726776625.32954: variable 'omit' from source: magic vars 8549 1726776625.32984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8549 1726776625.33010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8549 1726776625.33030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8549 1726776625.33048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8549 1726776625.33059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8549 1726776625.33090: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8549 1726776625.33095: variable 'ansible_host' from source: host vars for 'managed_node2' 8549 1726776625.33100: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8549 1726776625.33200: Set connection var ansible_connection to ssh 8549 1726776625.33209: Set connection var ansible_pipelining to False 8549 1726776625.33216: Set connection var ansible_timeout to 10 8549 1726776625.33223: Set connection var ansible_module_compression to ZIP_DEFLATED 8549 1726776625.33231: Set connection var ansible_shell_type to sh 8549 1726776625.33237: Set connection var ansible_shell_executable to /bin/sh 8549 1726776625.33259: variable 'ansible_shell_executable' from source: unknown 8549 1726776625.33264: variable 'ansible_connection' from source: unknown 8549 1726776625.33267: variable 'ansible_module_compression' from source: unknown 8549 1726776625.33273: variable 'ansible_shell_type' from source: unknown 8549 1726776625.33276: variable 'ansible_shell_executable' from source: unknown 8549 1726776625.33279: variable 'ansible_host' from source: host vars for 'managed_node2' 8549 1726776625.33282: variable 'ansible_pipelining' from source: unknown 8549 1726776625.33285: variable 'ansible_timeout' from source: unknown 8549 1726776625.33289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8549 1726776625.33383: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8549 1726776625.33396: variable 'omit' from source: magic vars 8549 1726776625.33402: starting attempt loop 8549 1726776625.33405: running the handler 8549 1726776625.33495: variable 'ansible_facts' from source: unknown 8549 1726776625.33613: _low_level_execute_command(): starting 8549 1726776625.33622: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8549 1726776625.36639: stdout chunk (state=2): >>>/root <<< 
8549 1726776625.36766: stderr chunk (state=3): >>><<< 8549 1726776625.36775: stdout chunk (state=3): >>><<< 8549 1726776625.36791: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8549 1726776625.36800: _low_level_execute_command(): starting 8549 1726776625.36804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339 `" && echo ansible-tmp-1726776625.3679686-8549-147442884888339="` echo /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339 `" ) && sleep 0' 8549 1726776625.39418: stdout chunk (state=2): >>>ansible-tmp-1726776625.3679686-8549-147442884888339=/root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339 <<< 8549 1726776625.39545: stderr chunk (state=3): >>><<< 8549 1726776625.39552: stdout chunk (state=3): >>><<< 8549 1726776625.39565: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776625.3679686-8549-147442884888339=/root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339 , stderr= 8549 1726776625.39597: variable 'ansible_module_compression' from source: unknown 8549 1726776625.39642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 8549 1726776625.39683: variable 'ansible_facts' from source: unknown 8549 1726776625.39793: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/AnsiballZ_dnf.py 8549 1726776625.40203: Sending initial data 8549 1726776625.40210: Sent initial data (150 bytes) 8549 1726776625.42683: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp19vkr_6c /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/AnsiballZ_dnf.py <<< 8549 1726776625.44293: stderr chunk (state=3): >>><<< 8549 1726776625.44301: stdout chunk (state=3): >>><<< 8549 
1726776625.44319: done transferring module to remote 8549 1726776625.44334: _low_level_execute_command(): starting 8549 1726776625.44343: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/ /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/AnsiballZ_dnf.py && sleep 0' 8549 1726776625.46933: stderr chunk (state=2): >>><<< 8549 1726776625.46945: stdout chunk (state=2): >>><<< 8549 1726776625.46959: _low_level_execute_command() done: rc=0, stdout=, stderr= 8549 1726776625.46964: _low_level_execute_command(): starting 8549 1726776625.46973: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/AnsiballZ_dnf.py && sleep 0' 8549 1726776627.99859: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8549 1726776628.07704: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8549 1726776628.07748: stderr chunk (state=3): >>><<< 8549 1726776628.07757: stdout chunk (state=3): >>><<< 8549 1726776628.07771: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
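The module's stdout above is a single JSON object. `"rc": 0` with `"changed": false` and `"msg": "Nothing to do"` means dnf found both packages already installed, so the task is idempotent. A small sketch of checking such a result, using an abridged copy of the payload from the log:

```python
import json

# Abridged stdout from the dnf module run shown above.
module_stdout = '''{"msg": "Nothing to do", "changed": false, "results": [],
 "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"],
 "state": "present"}}}'''

result = json.loads(module_stdout)
# rc == 0 and changed == false: the requested state was already satisfied.
idempotent_ok = result["rc"] == 0 and not result["changed"]
requested = result["invocation"]["module_args"]["name"]
```

The `invocation.module_args` echo is how the log is able to print the full effective argument set (including all the defaults such as `allow_downgrade` and `install_weak_deps`).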
8549 1726776628.07817: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8549 1726776628.07827: _low_level_execute_command(): starting 8549 1726776628.07835: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776625.3679686-8549-147442884888339/ > /dev/null 2>&1 && sleep 0' 8549 1726776628.10476: stderr chunk (state=2): >>><<< 8549 1726776628.10487: stdout chunk (state=2): >>><<< 8549 1726776628.10501: _low_level_execute_command() done: rc=0, stdout=, stderr= 8549 1726776628.10510: handler run complete 8549 1726776628.10538: attempt loop complete, returning result 8549 1726776628.10543: _execute() done 8549 1726776628.10546: dumping result to json 8549 1726776628.10552: done dumping result, returning 8549 1726776628.10559: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-cec2-986e-0000000000af] 8549 1726776628.10565: sending task result for task 120fa90a-8a95-cec2-986e-0000000000af 8549 1726776628.10596: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000af 8549 1726776628.10600: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8218 1726776628.10757: no more pending results, returning what 
we have 8218 1726776628.10760: results queue empty 8218 1726776628.10760: checking for any_errors_fatal 8218 1726776628.10767: done checking for any_errors_fatal 8218 1726776628.10767: checking for max_fail_percentage 8218 1726776628.10769: done checking for max_fail_percentage 8218 1726776628.10770: checking to see if all hosts have failed and the running result is not ok 8218 1726776628.10770: done checking to see if all hosts have failed 8218 1726776628.10771: getting the remaining hosts for this loop 8218 1726776628.10772: done getting the remaining hosts for this loop 8218 1726776628.10777: getting the next task for host managed_node2 8218 1726776628.10784: done getting next task for host managed_node2 8218 1726776628.10788: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8218 1726776628.10790: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776628.10799: getting variables 8218 1726776628.10800: in VariableManager get_vars() 8218 1726776628.10831: Calling all_inventory to load vars for managed_node2 8218 1726776628.10834: Calling groups_inventory to load vars for managed_node2 8218 1726776628.10836: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776628.10843: Calling all_plugins_play to load vars for managed_node2 8218 1726776628.10845: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776628.10847: Calling groups_plugins_play to load vars for managed_node2 8218 1726776628.11102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776628.11214: done with get_vars() 8218 1726776628.11221: done getting variables 8218 1726776628.11291: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:10:28 -0400 (0:00:02.824) 0:00:13.943 **** 8218 1726776628.11312: entering _queue_task() for managed_node2/debug 8218 1726776628.11313: Creating lock for debug 8218 1726776628.11470: worker is 1 (out of 1 available) 8218 1726776628.11487: exiting _queue_task() for managed_node2/debug 8218 1726776628.11498: done queuing things up, now waiting for results queue to drain 8218 1726776628.11499: waiting for pending results... 
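Each `done running TaskExecutor()` line above carries the task's unique ID in square brackets (e.g. `[120fa90a-8a95-cec2-986e-0000000000af]`), which is also echoed in the "sending task result" lines. When working through large `-vvv` logs it can help to extract these IDs to correlate queueing, execution, and result messages. A hypothetical helper (not part of Ansible):

```python
import re

# Hypothetical helper: pull task UUIDs out of -vvv log lines.
TASK_ID = re.compile(
    r"\[([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})\]"
)

line = ("done running TaskExecutor() for managed_node2/TASK: "
        "fedora.linux_system_roles.kernel_settings : Ensure required packages "
        "are installed [120fa90a-8a95-cec2-986e-0000000000af]")
match = TASK_ID.search(line)
task_id = match.group(1) if match else None
```

Grepping the log for one such ID gives the complete lifecycle of a single task across worker processes.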
8681 1726776628.11607: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8681 1726776628.11714: in run() - task 120fa90a-8a95-cec2-986e-0000000000b1 8681 1726776628.11733: variable 'ansible_search_path' from source: unknown 8681 1726776628.11740: variable 'ansible_search_path' from source: unknown 8681 1726776628.11766: calling self._execute() 8681 1726776628.11819: variable 'ansible_host' from source: host vars for 'managed_node2' 8681 1726776628.11825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8681 1726776628.11833: variable 'omit' from source: magic vars 8681 1726776628.12157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8681 1726776628.13652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8681 1726776628.13705: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8681 1726776628.13734: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8681 1726776628.13760: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8681 1726776628.13783: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8681 1726776628.13837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8681 1726776628.13858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8681 1726776628.13880: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8681 1726776628.13909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8681 1726776628.13921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8681 1726776628.13995: variable '__kernel_settings_is_transactional' from source: set_fact 8681 1726776628.14011: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8681 1726776628.14015: when evaluation is False, skipping this task 8681 1726776628.14019: _execute() done 8681 1726776628.14022: dumping result to json 8681 1726776628.14026: done dumping result, returning 8681 1726776628.14033: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-0000000000b1] 8681 1726776628.14040: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b1 8681 1726776628.14061: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b1 8681 1726776628.14065: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8218 1726776628.14165: no more pending results, returning what we have 8218 1726776628.14168: results queue empty 8218 1726776628.14168: checking for any_errors_fatal 8218 1726776628.14174: done checking for any_errors_fatal 8218 1726776628.14175: checking for max_fail_percentage 8218 1726776628.14176: done checking for max_fail_percentage 8218 1726776628.14176: checking to see if all hosts have failed and the running result is not 
ok 8218 1726776628.14177: done checking to see if all hosts have failed 8218 1726776628.14178: getting the remaining hosts for this loop 8218 1726776628.14179: done getting the remaining hosts for this loop 8218 1726776628.14181: getting the next task for host managed_node2 8218 1726776628.14187: done getting next task for host managed_node2 8218 1726776628.14190: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8218 1726776628.14192: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776628.14204: getting variables 8218 1726776628.14205: in VariableManager get_vars() 8218 1726776628.14233: Calling all_inventory to load vars for managed_node2 8218 1726776628.14236: Calling groups_inventory to load vars for managed_node2 8218 1726776628.14238: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776628.14245: Calling all_plugins_play to load vars for managed_node2 8218 1726776628.14247: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776628.14249: Calling groups_plugins_play to load vars for managed_node2 8218 1726776628.14353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776628.14472: done with get_vars() 8218 1726776628.14479: done getting variables 8218 1726776628.14575: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.032) 0:00:13.976 **** 8218 1726776628.14597: entering _queue_task() for managed_node2/reboot 8218 1726776628.14598: Creating lock for reboot 8218 1726776628.14748: worker is 1 (out of 1 available) 8218 1726776628.14762: exiting _queue_task() for managed_node2/reboot 8218 1726776628.14772: done queuing things up, now waiting for results queue to drain 8218 1726776628.14773: waiting for pending results... 
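The `skipping: [managed_node2]` result above comes from the `when` condition `__kernel_settings_is_transactional | d(false)`: `d` is Jinja2's `default` filter, so an undefined variable falls back to `false` instead of raising, and the task is skipped on non-transactional hosts. A plain-Python approximation of that evaluation (assuming host variables behave like a dict):

```python
def evaluate_when(hostvars, var_name):
    # Approximates `when: <var> | d(false)`: an undefined variable
    # defaults to False, then the value is truth-tested.
    return bool(hostvars.get(var_name, False))

# Fact never set on this host, so the task is skipped:
skipped = not evaluate_when({}, "__kernel_settings_is_transactional")
# Fact set true (e.g. on a transactional-update system), so the task runs:
runs = evaluate_when({"__kernel_settings_is_transactional": True},
                     "__kernel_settings_is_transactional")
```

This is why the "Notify user", "Reboot transactional update systems", and "Fail if reboot is needed" tasks all report the same `false_condition` on this host.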
8682 1726776628.14886: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8682 1726776628.14989: in run() - task 120fa90a-8a95-cec2-986e-0000000000b2 8682 1726776628.15005: variable 'ansible_search_path' from source: unknown 8682 1726776628.15010: variable 'ansible_search_path' from source: unknown 8682 1726776628.15038: calling self._execute() 8682 1726776628.15097: variable 'ansible_host' from source: host vars for 'managed_node2' 8682 1726776628.15106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8682 1726776628.15115: variable 'omit' from source: magic vars 8682 1726776628.15490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8682 1726776628.16952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8682 1726776628.17011: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8682 1726776628.17043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8682 1726776628.17070: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8682 1726776628.17096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8682 1726776628.17156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8682 1726776628.17180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8682 1726776628.17198: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8682 1726776628.17227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8682 1726776628.17241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8682 1726776628.17316: variable '__kernel_settings_is_transactional' from source: set_fact 8682 1726776628.17335: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8682 1726776628.17340: when evaluation is False, skipping this task 8682 1726776628.17344: _execute() done 8682 1726776628.17347: dumping result to json 8682 1726776628.17351: done dumping result, returning 8682 1726776628.17358: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-cec2-986e-0000000000b2] 8682 1726776628.17365: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b2 8682 1726776628.17391: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b2 8682 1726776628.17395: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776628.17508: no more pending results, returning what we have 8218 1726776628.17511: results queue empty 8218 1726776628.17511: checking for any_errors_fatal 8218 1726776628.17517: done checking for any_errors_fatal 8218 1726776628.17517: checking for max_fail_percentage 8218 1726776628.17519: done checking for max_fail_percentage 8218 1726776628.17520: checking to see if 
all hosts have failed and the running result is not ok 8218 1726776628.17520: done checking to see if all hosts have failed 8218 1726776628.17521: getting the remaining hosts for this loop 8218 1726776628.17522: done getting the remaining hosts for this loop 8218 1726776628.17525: getting the next task for host managed_node2 8218 1726776628.17532: done getting next task for host managed_node2 8218 1726776628.17535: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8218 1726776628.17537: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776628.17549: getting variables 8218 1726776628.17550: in VariableManager get_vars() 8218 1726776628.17639: Calling all_inventory to load vars for managed_node2 8218 1726776628.17642: Calling groups_inventory to load vars for managed_node2 8218 1726776628.17643: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776628.17649: Calling all_plugins_play to load vars for managed_node2 8218 1726776628.17651: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776628.17653: Calling groups_plugins_play to load vars for managed_node2 8218 1726776628.17748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776628.17861: done with get_vars() 8218 1726776628.17868: done getting variables 8218 1726776628.17907: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.033) 0:00:14.009 **** 8218 1726776628.17928: entering _queue_task() for managed_node2/fail 8218 1726776628.18085: worker is 1 (out of 1 available) 8218 1726776628.18099: exiting _queue_task() for managed_node2/fail 8218 1726776628.18110: done queuing things up, now waiting for results queue to drain 8218 1726776628.18112: waiting for pending results... 
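Each `TASK [...]` banner above ends with a timing suffix like `(0:00:00.033) 0:00:14.009 ****`: the first value is the previous task's duration, the second the cumulative playbook elapsed time (this style of suffix is typically produced by a timing callback such as `profile_tasks`). A sketch of parsing those values out of a banner line:

```python
import re
from datetime import timedelta

# Hypothetical parser for the "(0:00:00.033)  0:00:14.009" timing suffix.
TIMING = re.compile(r"\((\d+):(\d+):(\d+\.\d+)\)\s+(\d+):(\d+):(\d+\.\d+)")

def parse_timing(banner):
    m = TIMING.search(banner)
    if m is None:
        return None, None
    h1, m1, s1, h2, m2, s2 = m.groups()
    last = timedelta(hours=int(h1), minutes=int(m1), seconds=float(s1))
    total = timedelta(hours=int(h2), minutes=int(m2), seconds=float(s2))
    return last, total

last, total = parse_timing(
    "Thursday 19 September 2024 16:10:28 -0400 (0:00:00.033) 0:00:14.009 ****")
```

Applied across all banners, this reproduces the per-task profile: the package task above took 2.824 s, while the skipped conditional tasks each took ~0.03 s.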
8683 1726776628.18226: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set
8683 1726776628.18327: in run() - task 120fa90a-8a95-cec2-986e-0000000000b3
8683 1726776628.18345: variable 'ansible_search_path' from source: unknown
8683 1726776628.18349: variable 'ansible_search_path' from source: unknown
8683 1726776628.18378: calling self._execute()
8683 1726776628.18435: variable 'ansible_host' from source: host vars for 'managed_node2'
8683 1726776628.18443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8683 1726776628.18452: variable 'omit' from source: magic vars
8683 1726776628.18780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8683 1726776628.20289: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8683 1726776628.20371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8683 1726776628.20407: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8683 1726776628.20436: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8683 1726776628.20465: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8683 1726776628.20533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8683 1726776628.20563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8683 1726776628.20586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8683 1726776628.20614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8683 1726776628.20630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8683 1726776628.20723: variable '__kernel_settings_is_transactional' from source: set_fact
8683 1726776628.20743: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False
8683 1726776628.20747: when evaluation is False, skipping this task
8683 1726776628.20750: _execute() done
8683 1726776628.20752: dumping result to json
8683 1726776628.20754: done dumping result, returning
8683 1726776628.20759: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-cec2-986e-0000000000b3]
8683 1726776628.20766: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b3
8683 1726776628.20798: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b3
8683 1726776628.20800: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__kernel_settings_is_transactional | d(false)",
    "skip_reason": "Conditional result was False"
}
8218 1726776628.20986: no more pending results, returning what we have
8218 1726776628.20988: results queue empty
8218 1726776628.20989: checking for any_errors_fatal
8218 1726776628.20994: done checking for any_errors_fatal
8218 1726776628.20994: checking for max_fail_percentage
8218 1726776628.20996: done checking for max_fail_percentage
8218 1726776628.20996: checking to see if all hosts have failed and the running result is not ok
8218 1726776628.20997: done checking to see if all hosts have failed
8218 1726776628.20997: getting the remaining hosts for this loop
8218 1726776628.20999: done getting the remaining hosts for this loop
8218 1726776628.21001: getting the next task for host managed_node2
8218 1726776628.21008: done getting next task for host managed_node2
8218 1726776628.21011: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config
8218 1726776628.21013: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776628.21030: getting variables
8218 1726776628.21031: in VariableManager get_vars()
8218 1726776628.21065: Calling all_inventory to load vars for managed_node2
8218 1726776628.21068: Calling groups_inventory to load vars for managed_node2
8218 1726776628.21070: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776628.21079: Calling all_plugins_play to load vars for managed_node2
8218 1726776628.21081: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776628.21082: Calling groups_plugins_play to load vars for managed_node2
8218 1726776628.21203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776628.21340: done with get_vars()
8218 1726776628.21353: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ******
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42
Thursday 19 September 2024 16:10:28 -0400 (0:00:00.034) 0:00:14.044 ****
8218 1726776628.21421: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config
8218 1726776628.21423: Creating lock for fedora.linux_system_roles.kernel_settings_get_config
8218 1726776628.21614: worker is 1 (out of 1 available)
8218 1726776628.21631: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config
8218 1726776628.21643: done queuing things up, now waiting for results queue to drain
8218 1726776628.21645: waiting for pending results...
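The skip recorded above hinges on the Jinja2 expression `__kernel_settings_is_transactional | d(false)`: `d` is Jinja2's `default` filter, so when the fact was never set the expression falls back to `false` and the `when:` gate evaluates False. A minimal sketch of that fallback logic in plain Python (not Ansible's actual templar; `UNDEFINED` and `host_facts` are illustrative names):

```python
UNDEFINED = object()  # stand-in for Jinja2's Undefined sentinel


def d(value, fallback=False):
    """Mimic Jinja2's default/d filter: use fallback when undefined."""
    return fallback if value is UNDEFINED else value


# __kernel_settings_is_transactional was never set for this host
host_facts = {}
value = host_facts.get("__kernel_settings_is_transactional", UNDEFINED)
conditional = bool(d(value, False))
print(conditional)  # False -> "when evaluation is False, skipping this task"
```

Had a prior task set the fact to a truthy value, the same expression would pass it through unchanged and the `fail` action would run.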
8684 1726776628.21761: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config
8684 1726776628.21878: in run() - task 120fa90a-8a95-cec2-986e-0000000000b5
8684 1726776628.21894: variable 'ansible_search_path' from source: unknown
8684 1726776628.21898: variable 'ansible_search_path' from source: unknown
8684 1726776628.21923: calling self._execute()
8684 1726776628.22061: variable 'ansible_host' from source: host vars for 'managed_node2'
8684 1726776628.22071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8684 1726776628.22079: variable 'omit' from source: magic vars
8684 1726776628.22159: variable 'omit' from source: magic vars
8684 1726776628.22196: variable 'omit' from source: magic vars
8684 1726776628.22221: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars
8684 1726776628.22477: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars
8684 1726776628.22541: variable '__kernel_settings_tuned_dir' from source: role '' all vars
8684 1726776628.22582: variable 'omit' from source: magic vars
8684 1726776628.22648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8684 1726776628.22680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8684 1726776628.22700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8684 1726776628.22713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8684 1726776628.22724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8684 1726776628.22749: variable 'inventory_hostname' from source: host vars for 'managed_node2'
8684 1726776628.22759: variable 'ansible_host' from source: host vars for 'managed_node2'
8684 1726776628.22763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8684 1726776628.22833: Set connection var ansible_connection to ssh
8684 1726776628.22840: Set connection var ansible_pipelining to False
8684 1726776628.22845: Set connection var ansible_timeout to 10
8684 1726776628.22850: Set connection var ansible_module_compression to ZIP_DEFLATED
8684 1726776628.22854: Set connection var ansible_shell_type to sh
8684 1726776628.22857: Set connection var ansible_shell_executable to /bin/sh
8684 1726776628.22870: variable 'ansible_shell_executable' from source: unknown
8684 1726776628.22873: variable 'ansible_connection' from source: unknown
8684 1726776628.22875: variable 'ansible_module_compression' from source: unknown
8684 1726776628.22877: variable 'ansible_shell_type' from source: unknown
8684 1726776628.22879: variable 'ansible_shell_executable' from source: unknown
8684 1726776628.22880: variable 'ansible_host' from source: host vars for 'managed_node2'
8684 1726776628.22882: variable 'ansible_pipelining' from source: unknown
8684 1726776628.22884: variable 'ansible_timeout' from source: unknown
8684 1726776628.22886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8684 1726776628.23024: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
8684 1726776628.23040: variable 'omit' from source: magic vars
8684 1726776628.23050: starting attempt loop
8684 1726776628.23055: running the handler
8684 1726776628.23068: _low_level_execute_command(): starting
8684 1726776628.23076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8684 1726776628.25461: stdout chunk (state=2): >>>/root
<<<
8684 1726776628.25598: stderr chunk (state=3): >>><<<
8684 1726776628.25606: stdout chunk (state=3): >>><<<
8684 1726776628.25626: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
8684 1726776628.25643: _low_level_execute_command(): starting
8684 1726776628.25650: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761 `" && echo ansible-tmp-1726776628.2563667-8684-213426006795761="` echo /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761 `" ) && sleep 0'
8684 1726776628.28303: stdout chunk (state=2): >>>ansible-tmp-1726776628.2563667-8684-213426006795761=/root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761
<<<
8684 1726776628.28433: stderr chunk (state=3): >>><<<
8684 1726776628.28440: stdout chunk (state=3): >>><<<
8684 1726776628.28453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776628.2563667-8684-213426006795761=/root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761
, stderr=
8684 1726776628.28492: variable 'ansible_module_compression' from source: unknown
8684 1726776628.28521: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config
8684 1726776628.28526: ANSIBALLZ: Acquiring lock
8684 1726776628.28531: ANSIBALLZ: Lock acquired: 140571203766848
8684 1726776628.28536: ANSIBALLZ: Creating module
8684 1726776628.41476: ANSIBALLZ: Writing module into payload
8684 1726776628.41561: ANSIBALLZ: Writing module
8684 1726776628.41583: ANSIBALLZ: Renaming module
8684 1726776628.41591: ANSIBALLZ: Done creating module
8684 1726776628.41611: variable 'ansible_facts' from source: unknown
8684 1726776628.41696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/AnsiballZ_kernel_settings_get_config.py
8684 1726776628.42127: Sending initial data
8684 1726776628.42136: Sent initial data (173 bytes)
8684 1726776628.47359: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpg_w4da36 /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/AnsiballZ_kernel_settings_get_config.py
<<<
8684 1726776628.50134: stderr chunk (state=3): >>><<<
8684 1726776628.50146: stdout chunk (state=3): >>><<<
8684 1726776628.50169: done transferring module to remote
8684 1726776628.50187: _low_level_execute_command(): starting
8684 1726776628.50193: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/ /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/AnsiballZ_kernel_settings_get_config.py && sleep 0'
8684 1726776628.53217: stderr chunk (state=2): >>><<<
8684 1726776628.53227: stdout chunk (state=2): >>><<<
8684 1726776628.53247: _low_level_execute_command() done: rc=0, stdout=, stderr=
8684 1726776628.53253: _low_level_execute_command(): starting
8684 1726776628.53259: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/AnsiballZ_kernel_settings_get_config.py && sleep 0'
8684 1726776628.69378: stdout chunk (state=2): >>>
{"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}}
<<<
8684 1726776628.70499: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed.
<<<
8684 1726776628.70545: stderr chunk (state=3): >>><<<
8684 1726776628.70552: stdout chunk (state=3): >>><<<
8684 1726776628.70568: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}}
, stderr=Shared connection to 10.31.12.75 closed.
8684 1726776628.70596: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8684 1726776628.70607: _low_level_execute_command(): starting
8684 1726776628.70612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776628.2563667-8684-213426006795761/ > /dev/null 2>&1 && sleep 0'
8684 1726776628.73042: stderr chunk (state=2): >>><<<
8684 1726776628.73050: stdout chunk (state=2): >>><<<
8684 1726776628.73064: _low_level_execute_command() done: rc=0, stdout=, stderr=
8684 1726776628.73071: handler run complete
8684 1726776628.73085: attempt loop complete, returning result
8684 1726776628.73090: _execute() done
8684 1726776628.73093: dumping result to json
8684 1726776628.73098: done dumping result, returning
8684 1726776628.73105: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-cec2-986e-0000000000b5]
8684 1726776628.73111: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b5
8684 1726776628.73142: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b5
8684 1726776628.73146: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "data": {
        "daemon": "1",
        "default_instance_priority": "0",
        "dynamic_tuning": "0",
        "log_file_count": "2",
        "log_file_max_size": "1MB",
        "reapply_sysctl": "1",
        "recommend_command": "1",
        "sleep_interval": "1",
        "udev_buffer_size": "1MB",
        "update_interval": "10"
    }
}
8218 1726776628.73347: no more pending results, returning what we have
8218 1726776628.73350: results queue empty
8218 1726776628.73351: checking for any_errors_fatal
8218 1726776628.73354: done checking for any_errors_fatal
8218 1726776628.73355: checking for max_fail_percentage
8218 1726776628.73356: done checking for max_fail_percentage
8218 1726776628.73357: checking to see if all hosts have failed and the running result is not ok
8218 1726776628.73358: done checking to see if all hosts have failed
8218 1726776628.73358: getting the remaining hosts for this loop
8218 1726776628.73359: done getting the remaining hosts for this loop
8218 1726776628.73362: getting the next task for host managed_node2
8218 1726776628.73367: done getting next task for host managed_node2
8218 1726776628.73370: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory
8218 1726776628.73372: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776628.73382: getting variables
8218 1726776628.73384: in VariableManager get_vars()
8218 1726776628.73408: Calling all_inventory to load vars for managed_node2
8218 1726776628.73410: Calling groups_inventory to load vars for managed_node2
8218 1726776628.73412: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776628.73420: Calling all_plugins_play to load vars for managed_node2
8218 1726776628.73422: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776628.73424: Calling groups_plugins_play to load vars for managed_node2
8218 1726776628.73542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776628.73656: done with get_vars()
8218 1726776628.73664: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
Thursday 19 September 2024 16:10:28 -0400 (0:00:00.523) 0:00:14.567 ****
8218 1726776628.73733: entering _queue_task() for managed_node2/stat
8218 1726776628.73888: worker is 1 (out of 1 available)
8218 1726776628.73901: exiting _queue_task() for managed_node2/stat
8218 1726776628.73912: done queuing things up, now waiting for results queue to drain
8218 1726776628.73914: waiting for pending results...
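The `data` mapping returned by the "Read tuned main config" task above is the contents of `/etc/tuned/tuned-main.conf`, a file of `key = value` lines. A rough sketch of how such a file maps to that dictionary (this is not the role's actual `kernel_settings_get_config` module, only an illustration of the output shape; the `sample` excerpt is illustrative):

```python
def parse_main_conf(text):
    """Parse tuned-main.conf-style 'key = value' lines into a dict."""
    data = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        data[key.strip()] = value.strip()
    return data


sample = """\
# tuned-main.conf (excerpt, illustrative)
daemon = 1
dynamic_tuning = 0
update_interval = 10
"""
print(parse_main_conf(sample))
# {'daemon': '1', 'dynamic_tuning': '0', 'update_interval': '10'}
```

All values stay strings, which matches the quoted `"1"`, `"10"`, `"1MB"` values in the task result.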
8711 1726776628.74025: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory
8711 1726776628.74128: in run() - task 120fa90a-8a95-cec2-986e-0000000000b6
8711 1726776628.74147: variable 'ansible_search_path' from source: unknown
8711 1726776628.74152: variable 'ansible_search_path' from source: unknown
8711 1726776628.74187: variable '__prof_from_conf' from source: task vars
8711 1726776628.74418: variable '__prof_from_conf' from source: task vars
8711 1726776628.74554: variable '__data' from source: task vars
8711 1726776628.74609: variable '__kernel_settings_register_tuned_main' from source: set_fact
8711 1726776628.74742: variable '__kernel_settings_tuned_dir' from source: role '' all vars
8711 1726776628.74752: variable '__kernel_settings_tuned_dir' from source: role '' all vars
8711 1726776628.74798: variable '__kernel_settings_tuned_dir' from source: role '' all vars
8711 1726776628.74814: variable 'omit' from source: magic vars
8711 1726776628.74886: variable 'ansible_host' from source: host vars for 'managed_node2'
8711 1726776628.74897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8711 1726776628.74970: variable 'omit' from source: magic vars
8711 1726776628.75240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8711 1726776628.76822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8711 1726776628.76878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8711 1726776628.76906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8711 1726776628.76934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8711 1726776628.76957: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8711 1726776628.77008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8711 1726776628.77030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8711 1726776628.77053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8711 1726776628.77081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8711 1726776628.77091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8711 1726776628.77156: variable 'item' from source: unknown
8711 1726776628.77169: Evaluated conditional (item | length > 0): False
8711 1726776628.77174: when evaluation is False, skipping this task
8711 1726776628.77195: variable 'item' from source: unknown
8711 1726776628.77240: variable 'item' from source: unknown
skipping: [managed_node2] => (item=)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "item | length > 0",
    "item": "",
    "skip_reason": "Conditional result was False"
}
8711 1726776628.77309: variable 'ansible_host' from source: host vars for 'managed_node2'
8711 1726776628.77319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8711 1726776628.77328: variable 'omit' from source: magic vars
8711 1726776628.77442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8711 1726776628.77459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8711 1726776628.77477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8711 1726776628.77507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8711 1726776628.77518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8711 1726776628.77580: variable 'item' from source: unknown
8711 1726776628.77593: Evaluated conditional (item | length > 0): True
8711 1726776628.77599: variable 'omit' from source: magic vars
8711 1726776628.77635: variable 'omit' from source: magic vars
8711 1726776628.77677: variable 'item' from source: unknown
8711 1726776628.77737: variable 'item' from source: unknown
8711 1726776628.77752: variable 'omit' from source: magic vars
8711 1726776628.77778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8711 1726776628.77801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8711 1726776628.77816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8711 1726776628.77831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8711 1726776628.77840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8711 1726776628.77863: variable 'inventory_hostname' from source: host vars for 'managed_node2'
8711 1726776628.77867: variable 'ansible_host' from source: host vars for 'managed_node2'
8711 1726776628.77870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8711 1726776628.77953: Set connection var ansible_connection to ssh
8711 1726776628.77961: Set connection var ansible_pipelining to False
8711 1726776628.77967: Set connection var ansible_timeout to 10
8711 1726776628.77976: Set connection var ansible_module_compression to ZIP_DEFLATED
8711 1726776628.77981: Set connection var ansible_shell_type to sh
8711 1726776628.77986: Set connection var ansible_shell_executable to /bin/sh
8711 1726776628.78001: variable 'ansible_shell_executable' from source: unknown
8711 1726776628.78004: variable 'ansible_connection' from source: unknown
8711 1726776628.78007: variable 'ansible_module_compression' from source: unknown
8711 1726776628.78009: variable 'ansible_shell_type' from source: unknown
8711 1726776628.78011: variable 'ansible_shell_executable' from source: unknown
8711 1726776628.78014: variable 'ansible_host' from source: host vars for 'managed_node2'
8711 1726776628.78017: variable 'ansible_pipelining' from source: unknown
8711 1726776628.78019: variable 'ansible_timeout' from source: unknown
8711 1726776628.78022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8711 1726776628.78146: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
8711 1726776628.78158: variable 'omit' from source: magic vars
8711 1726776628.78163: starting attempt loop
8711 1726776628.78166: running the handler
8711 1726776628.78180: _low_level_execute_command(): starting
8711 1726776628.78188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8711 1726776628.80796: stdout chunk (state=2): >>>/root
<<<
8711 1726776628.80902: stderr chunk (state=3): >>><<<
8711 1726776628.80908: stdout chunk (state=3): >>><<<
8711 1726776628.80923: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
8711 1726776628.80935: _low_level_execute_command(): starting
8711 1726776628.80939: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135 `" && echo ansible-tmp-1726776628.8093135-8711-9363824817135="` echo /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135 `" ) && sleep 0'
8711 1726776628.83423: stdout chunk (state=2): >>>ansible-tmp-1726776628.8093135-8711-9363824817135=/root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135
<<<
8711 1726776628.83549: stderr chunk (state=3): >>><<<
8711 1726776628.83556: stdout chunk (state=3): >>><<<
8711 1726776628.83568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776628.8093135-8711-9363824817135=/root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135
, stderr=
8711 1726776628.83609: variable 'ansible_module_compression' from source: unknown
8711 1726776628.83648: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
8711 1726776628.83676: variable 'ansible_facts' from source: unknown
8711 1726776628.83743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/AnsiballZ_stat.py
8711 1726776628.83837: Sending initial data
8711 1726776628.83844: Sent initial data (149 bytes)
8711 1726776628.86409: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp1dbbf152 /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/AnsiballZ_stat.py
<<<
8711 1726776628.87486: stderr chunk (state=3): >>><<<
8711 1726776628.87492: stdout chunk (state=3): >>><<<
8711 1726776628.87509: done transferring module to remote
8711 1726776628.87518: _low_level_execute_command(): starting
8711 1726776628.87523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/ /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/AnsiballZ_stat.py && sleep 0'
8711 1726776628.90071: stderr chunk (state=2): >>><<<
8711 1726776628.90084: stdout chunk (state=2): >>><<<
8711 1726776628.90099: _low_level_execute_command() done: rc=0, stdout=, stderr=
8711 1726776628.90104: _low_level_execute_command(): starting
8711 1726776628.90110: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/AnsiballZ_stat.py && sleep 0'
8711 1726776629.06161: stdout chunk (state=2): >>>
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
<<<
8711 1726776629.07212: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed.
<<<
8711 1726776629.07258: stderr chunk (state=3): >>><<<
8711 1726776629.07265: stdout chunk (state=3): >>><<<
8711 1726776629.07281: _low_level_execute_command() done: rc=0, stdout=
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
, stderr=Shared connection to 10.31.12.75 closed.
8711 1726776629.07303: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8711 1726776629.07314: _low_level_execute_command(): starting
8711 1726776629.07320: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776628.8093135-8711-9363824817135/ > /dev/null 2>&1 && sleep 0'
8711 1726776629.09738: stderr chunk (state=2): >>><<<
8711 1726776629.09745: stdout chunk (state=2): >>><<<
8711 1726776629.09757: _low_level_execute_command() done: rc=0, stdout=, stderr=
8711 1726776629.09763: handler run complete
8711 1726776629.09779: attempt loop complete, returning result
8711 1726776629.09795: variable 'item' from source: unknown
8711 1726776629.09867: variable 'item' from source: unknown
ok: [managed_node2] => (item=/etc/tuned/profiles) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned/profiles",
    "stat": {
        "exists": false
    }
}
8711 1726776629.09956: variable 'ansible_host' from source: host vars for 'managed_node2'
8711 1726776629.09966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
8711 1726776629.09978: variable 'omit' from source: magic vars
8711 1726776629.10110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8711 1726776629.10138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8711 1726776629.10162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8711 1726776629.10202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8711 1726776629.10216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8711 1726776629.10298: variable 'item' from source: unknown
8711 1726776629.10308: Evaluated conditional (item | length > 0): True
8711 1726776629.10313: variable 'omit' from source: magic vars
8711 1726776629.10330: variable 'omit' from source: magic vars
8711 1726776629.10372: variable 'item' from source: unknown
8711 1726776629.10440: variable 'item' from source: unknown
8711 1726776629.10456: variable 'omit' from source: magic vars
8711 1726776629.10477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8711 1726776629.10486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8711 1726776629.10492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8711 1726776629.10504: variable 'inventory_hostname' from source: host vars for 'managed_node2'
8711 1726776629.10508: variable 'ansible_host' from source: host vars for 'managed_node2'
8711
1726776629.10511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8711 1726776629.10583: Set connection var ansible_connection to ssh 8711 1726776629.10591: Set connection var ansible_pipelining to False 8711 1726776629.10597: Set connection var ansible_timeout to 10 8711 1726776629.10604: Set connection var ansible_module_compression to ZIP_DEFLATED 8711 1726776629.10609: Set connection var ansible_shell_type to sh 8711 1726776629.10614: Set connection var ansible_shell_executable to /bin/sh 8711 1726776629.10633: variable 'ansible_shell_executable' from source: unknown 8711 1726776629.10638: variable 'ansible_connection' from source: unknown 8711 1726776629.10641: variable 'ansible_module_compression' from source: unknown 8711 1726776629.10643: variable 'ansible_shell_type' from source: unknown 8711 1726776629.10646: variable 'ansible_shell_executable' from source: unknown 8711 1726776629.10648: variable 'ansible_host' from source: host vars for 'managed_node2' 8711 1726776629.10652: variable 'ansible_pipelining' from source: unknown 8711 1726776629.10655: variable 'ansible_timeout' from source: unknown 8711 1726776629.10659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8711 1726776629.10759: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8711 1726776629.10769: variable 'omit' from source: magic vars 8711 1726776629.10778: starting attempt loop 8711 1726776629.10781: running the handler 8711 1726776629.10788: _low_level_execute_command(): starting 8711 1726776629.10792: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8711 1726776629.13388: stdout chunk (state=2): >>>/root <<< 8711 1726776629.13520: stderr chunk (state=3): 
>>><<< 8711 1726776629.13534: stdout chunk (state=3): >>><<< 8711 1726776629.13555: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8711 1726776629.13570: _low_level_execute_command(): starting 8711 1726776629.13580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289 `" && echo ansible-tmp-1726776629.1356666-8711-140444812795289="` echo /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289 `" ) && sleep 0' 8711 1726776629.16176: stdout chunk (state=2): >>>ansible-tmp-1726776629.1356666-8711-140444812795289=/root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289 <<< 8711 1726776629.16434: stderr chunk (state=3): >>><<< 8711 1726776629.16442: stdout chunk (state=3): >>><<< 8711 1726776629.16458: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776629.1356666-8711-140444812795289=/root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289 , stderr= 8711 1726776629.16499: variable 'ansible_module_compression' from source: unknown 8711 1726776629.16551: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8711 1726776629.16572: variable 'ansible_facts' from source: unknown 8711 1726776629.16654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/AnsiballZ_stat.py 8711 1726776629.17098: Sending initial data 8711 1726776629.17106: Sent initial data (151 bytes) 8711 1726776629.19615: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp53kqhjwx /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/AnsiballZ_stat.py <<< 8711 1726776629.21478: stderr chunk (state=3): >>><<< 8711 1726776629.21487: stdout chunk (state=3): >>><<< 8711 1726776629.21509: done transferring module to remote 8711 
1726776629.21519: _low_level_execute_command(): starting 8711 1726776629.21525: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/ /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/AnsiballZ_stat.py && sleep 0' 8711 1726776629.24337: stderr chunk (state=2): >>><<< 8711 1726776629.24344: stdout chunk (state=2): >>><<< 8711 1726776629.24358: _low_level_execute_command() done: rc=0, stdout=, stderr= 8711 1726776629.24361: _low_level_execute_command(): starting 8711 1726776629.24365: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/AnsiballZ_stat.py && sleep 0' 8711 1726776629.40284: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776624.4448986, "mtime": 1726776622.6488929, "ctime": 1726776622.6488929, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8711 1726776629.41478: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8711 1726776629.41520: stderr chunk (state=3): >>><<< 8711 1726776629.41528: stdout chunk (state=3): >>><<< 8711 1726776629.41547: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776624.4448986, "mtime": 1726776622.6488929, "ctime": 1726776622.6488929, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
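The stat payload logged above is plain JSON emitted on the module's stdout. A minimal sketch of pulling out the fields a role typically checks (the payload below is trimmed from the logged result; the parsing code is illustrative, not Ansible's actual handling):

```python
import json

# Trimmed-down copy of the stat result logged above for /etc/tuned
raw = (
    '{"changed": false, "stat": {"exists": true, "path": "/etc/tuned",'
    ' "mode": "0755", "isdir": true, "uid": 0, "gid": 0}}'
)

result = json.loads(raw)
stat = result["stat"]

# A candidate is usable as a profile parent only if it exists and is a directory
usable = stat["exists"] and stat.get("isdir", False)
print(usable, stat["path"], stat["mode"])
```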
8711 1726776629.41591: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8711 1726776629.41601: _low_level_execute_command(): starting 8711 1726776629.41607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776629.1356666-8711-140444812795289/ > /dev/null 2>&1 && sleep 0' 8711 1726776629.44871: stderr chunk (state=2): >>><<< 8711 1726776629.44880: stdout chunk (state=2): >>><<< 8711 1726776629.44897: _low_level_execute_command() done: rc=0, stdout=, stderr= 8711 1726776629.44904: handler run complete 8711 1726776629.44954: attempt loop complete, returning result 8711 1726776629.44978: variable 'item' from source: unknown 8711 1726776629.45062: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776624.4448986, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726776622.6488929, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726776622.6488929, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8711 1726776629.45114: dumping result to json 8711 1726776629.45124: done dumping result, returning 8711 1726776629.45135: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-cec2-986e-0000000000b6] 8711 1726776629.45142: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b6 8711 1726776629.45186: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b6 8711 1726776629.45190: WORKER PROCESS EXITING 8218 1726776629.45830: no more pending results, returning what we have 8218 1726776629.45833: results queue empty 8218 1726776629.45834: checking for any_errors_fatal 8218 1726776629.45839: done checking for any_errors_fatal 8218 1726776629.45840: checking for max_fail_percentage 8218 1726776629.45842: done checking for max_fail_percentage 8218 1726776629.45843: checking to see if all hosts have failed and the running result is not ok 8218 1726776629.45843: done checking to see if all hosts have failed 8218 1726776629.45844: getting the remaining hosts for this loop 8218 1726776629.45845: done getting the remaining hosts for this loop 8218 1726776629.45848: getting the next task for host managed_node2 8218 1726776629.45855: done getting next task for host managed_node2 8218 1726776629.45858: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8218 1726776629.45861: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776629.45871: getting variables 8218 1726776629.45872: in VariableManager get_vars() 8218 1726776629.45902: Calling all_inventory to load vars for managed_node2 8218 1726776629.45905: Calling groups_inventory to load vars for managed_node2 8218 1726776629.45907: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776629.45917: Calling all_plugins_play to load vars for managed_node2 8218 1726776629.45919: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776629.45922: Calling groups_plugins_play to load vars for managed_node2 8218 1726776629.46114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776629.46309: done with get_vars() 8218 1726776629.46319: done getting variables 8218 1726776629.46376: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:10:29 -0400 (0:00:00.726) 0:00:15.294 **** 8218 1726776629.46404: entering _queue_task() for managed_node2/set_fact 8218 1726776629.46606: worker is 1 (out of 1 available) 8218 1726776629.46620: exiting _queue_task() for managed_node2/set_fact 
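The "Set tuned profile parent dir" task queued here follows from the two stat results above: `/etc/tuned/profiles` did not exist, `/etc/tuned` did, and the fact ends up as `/etc/tuned`. A hypothetical Python sketch of that first-existing-candidate selection (names and structure are illustrative, not the role's actual code):

```python
# Loop results as logged: /etc/tuned/profiles absent, /etc/tuned present
candidates = [
    {"item": "/etc/tuned/profiles", "stat": {"exists": False}},
    {"item": "/etc/tuned", "stat": {"exists": True}},
]

# First candidate whose stat reported exists=true wins,
# matching the fact set to /etc/tuned in the task result below
profile_parent = next(
    (r["item"] for r in candidates if r["stat"]["exists"]), None
)
print(profile_parent)
```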
8218 1726776629.46633: done queuing things up, now waiting for results queue to drain 8218 1726776629.46635: waiting for pending results... 8766 1726776629.46857: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8766 1726776629.46979: in run() - task 120fa90a-8a95-cec2-986e-0000000000b7 8766 1726776629.46995: variable 'ansible_search_path' from source: unknown 8766 1726776629.46999: variable 'ansible_search_path' from source: unknown 8766 1726776629.47032: calling self._execute() 8766 1726776629.47105: variable 'ansible_host' from source: host vars for 'managed_node2' 8766 1726776629.47116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8766 1726776629.47125: variable 'omit' from source: magic vars 8766 1726776629.47220: variable 'omit' from source: magic vars 8766 1726776629.47267: variable 'omit' from source: magic vars 8766 1726776629.47605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8766 1726776629.49137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8766 1726776629.49185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8766 1726776629.49213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8766 1726776629.49295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8766 1726776629.49321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8766 1726776629.49392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8766 1726776629.49419: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8766 1726776629.49442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8766 1726776629.49469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8766 1726776629.49483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8766 1726776629.49514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8766 1726776629.49537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8766 1726776629.49554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8766 1726776629.49581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8766 1726776629.49592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8766 
1726776629.49645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8766 1726776629.49663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8766 1726776629.49682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8766 1726776629.49707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8766 1726776629.49718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8766 1726776629.49868: variable '__kernel_settings_find_profile_dirs' from source: set_fact 8766 1726776629.49934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8766 1726776629.50064: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8766 1726776629.50100: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8766 1726776629.50128: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8766 1726776629.50158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8766 1726776629.50199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8766 1726776629.50219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8766 1726776629.50243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8766 1726776629.50268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8766 1726776629.50318: variable 'omit' from source: magic vars 8766 1726776629.50369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8766 1726776629.50397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8766 1726776629.50417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8766 1726776629.50435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8766 1726776629.50446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8766 1726776629.50471: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8766 1726776629.50479: variable 'ansible_host' from source: host vars for 'managed_node2' 8766 1726776629.50483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8766 1726776629.50572: Set connection var ansible_connection to ssh 8766 1726776629.50583: Set connection var ansible_pipelining to False 8766 1726776629.50589: Set connection var ansible_timeout to 10 8766 
1726776629.50596: Set connection var ansible_module_compression to ZIP_DEFLATED 8766 1726776629.50601: Set connection var ansible_shell_type to sh 8766 1726776629.50606: Set connection var ansible_shell_executable to /bin/sh 8766 1726776629.50625: variable 'ansible_shell_executable' from source: unknown 8766 1726776629.50630: variable 'ansible_connection' from source: unknown 8766 1726776629.50634: variable 'ansible_module_compression' from source: unknown 8766 1726776629.50636: variable 'ansible_shell_type' from source: unknown 8766 1726776629.50638: variable 'ansible_shell_executable' from source: unknown 8766 1726776629.50641: variable 'ansible_host' from source: host vars for 'managed_node2' 8766 1726776629.50644: variable 'ansible_pipelining' from source: unknown 8766 1726776629.50647: variable 'ansible_timeout' from source: unknown 8766 1726776629.50651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8766 1726776629.50741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8766 1726776629.50753: variable 'omit' from source: magic vars 8766 1726776629.50759: starting attempt loop 8766 1726776629.50762: running the handler 8766 1726776629.50772: handler run complete 8766 1726776629.50784: attempt loop complete, returning result 8766 1726776629.50789: _execute() done 8766 1726776629.50792: dumping result to json 8766 1726776629.50795: done dumping result, returning 8766 1726776629.50801: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-cec2-986e-0000000000b7] 8766 1726776629.50809: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b7 8766 1726776629.50833: done sending task 
result for task 120fa90a-8a95-cec2-986e-0000000000b7 8766 1726776629.50837: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8218 1726776629.51110: no more pending results, returning what we have 8218 1726776629.51113: results queue empty 8218 1726776629.51114: checking for any_errors_fatal 8218 1726776629.51121: done checking for any_errors_fatal 8218 1726776629.51122: checking for max_fail_percentage 8218 1726776629.51123: done checking for max_fail_percentage 8218 1726776629.51124: checking to see if all hosts have failed and the running result is not ok 8218 1726776629.51124: done checking to see if all hosts have failed 8218 1726776629.51125: getting the remaining hosts for this loop 8218 1726776629.51126: done getting the remaining hosts for this loop 8218 1726776629.51131: getting the next task for host managed_node2 8218 1726776629.51136: done getting next task for host managed_node2 8218 1726776629.51140: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8218 1726776629.51142: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776629.51152: getting variables 8218 1726776629.51154: in VariableManager get_vars() 8218 1726776629.51183: Calling all_inventory to load vars for managed_node2 8218 1726776629.51186: Calling groups_inventory to load vars for managed_node2 8218 1726776629.51188: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776629.51197: Calling all_plugins_play to load vars for managed_node2 8218 1726776629.51199: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776629.51202: Calling groups_plugins_play to load vars for managed_node2 8218 1726776629.51362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776629.51557: done with get_vars() 8218 1726776629.51565: done getting variables 8218 1726776629.51615: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:10:29 -0400 (0:00:00.052) 0:00:15.346 **** 8218 1726776629.51640: entering _queue_task() for managed_node2/service 8218 1726776629.51793: worker is 1 (out of 1 available) 8218 1726776629.51806: exiting _queue_task() for managed_node2/service 8218 1726776629.51816: done queuing things up, now waiting for results queue to drain 8218 1726776629.51817: waiting for pending results... 
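The remote tmp-dir dance that recurs in this log (`umask 77 && mkdir -p ... && mkdir "ansible-tmp-<epoch>-<pid>-<random>"`) just creates an owner-only scratch directory with a unique name. A rough Python equivalent, purely illustrative of the pattern (the naming helper is hypothetical, not Ansible's implementation):

```python
import os
import tempfile
import time

# Mimic the ansible-tmp-<epoch>-<pid>-<suffix> naming seen in the log
base = os.path.join(tempfile.gettempdir(), ".ansible", "tmp")
name = f"ansible-tmp-{time.time()}-{os.getpid()}-example"
path = os.path.join(base, name)

os.makedirs(base, exist_ok=True)
# mode=0o700 has no group/other bits, so the result is owner-only
# regardless of the process umask (the log's `umask 77` serves the same end)
os.mkdir(path, mode=0o700)
print(oct(os.stat(path).st_mode & 0o777))
```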
8770 1726776629.51931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8770 1726776629.52030: in run() - task 120fa90a-8a95-cec2-986e-0000000000b8 8770 1726776629.52045: variable 'ansible_search_path' from source: unknown 8770 1726776629.52049: variable 'ansible_search_path' from source: unknown 8770 1726776629.52085: variable '__kernel_settings_services' from source: include_vars 8770 1726776629.52356: variable '__kernel_settings_services' from source: include_vars 8770 1726776629.52410: variable 'omit' from source: magic vars 8770 1726776629.52477: variable 'ansible_host' from source: host vars for 'managed_node2' 8770 1726776629.52488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8770 1726776629.52496: variable 'omit' from source: magic vars 8770 1726776629.52545: variable 'omit' from source: magic vars 8770 1726776629.52570: variable 'omit' from source: magic vars 8770 1726776629.52601: variable 'item' from source: unknown 8770 1726776629.52652: variable 'item' from source: unknown 8770 1726776629.52669: variable 'omit' from source: magic vars 8770 1726776629.52698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8770 1726776629.52723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8770 1726776629.52741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8770 1726776629.52753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8770 1726776629.52760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8770 1726776629.52780: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8770 1726776629.52784: variable 
'ansible_host' from source: host vars for 'managed_node2' 8770 1726776629.52787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8770 1726776629.52892: Set connection var ansible_connection to ssh 8770 1726776629.52904: Set connection var ansible_pipelining to False 8770 1726776629.52911: Set connection var ansible_timeout to 10 8770 1726776629.52919: Set connection var ansible_module_compression to ZIP_DEFLATED 8770 1726776629.52924: Set connection var ansible_shell_type to sh 8770 1726776629.52930: Set connection var ansible_shell_executable to /bin/sh 8770 1726776629.52947: variable 'ansible_shell_executable' from source: unknown 8770 1726776629.52951: variable 'ansible_connection' from source: unknown 8770 1726776629.52954: variable 'ansible_module_compression' from source: unknown 8770 1726776629.52956: variable 'ansible_shell_type' from source: unknown 8770 1726776629.52958: variable 'ansible_shell_executable' from source: unknown 8770 1726776629.52961: variable 'ansible_host' from source: host vars for 'managed_node2' 8770 1726776629.52964: variable 'ansible_pipelining' from source: unknown 8770 1726776629.52967: variable 'ansible_timeout' from source: unknown 8770 1726776629.52971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8770 1726776629.53091: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8770 1726776629.53103: variable 'omit' from source: magic vars 8770 1726776629.53109: starting attempt loop 8770 1726776629.53112: running the handler 8770 1726776629.53192: variable 'ansible_facts' from source: unknown 8770 1726776629.53299: _low_level_execute_command(): starting 8770 1726776629.53308: _low_level_execute_command(): executing: /bin/sh 
-c 'echo ~ && sleep 0' 8770 1726776629.56443: stdout chunk (state=2): >>>/root <<< 8770 1726776629.56707: stderr chunk (state=3): >>><<< 8770 1726776629.56715: stdout chunk (state=3): >>><<< 8770 1726776629.56737: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8770 1726776629.56751: _low_level_execute_command(): starting 8770 1726776629.56756: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613 `" && echo ansible-tmp-1726776629.5674508-8770-38889254681613="` echo /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613 `" ) && sleep 0' 8770 1726776629.61335: stdout chunk (state=2): >>>ansible-tmp-1726776629.5674508-8770-38889254681613=/root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613 <<< 8770 1726776629.61471: stderr chunk (state=3): >>><<< 8770 1726776629.61478: stdout chunk (state=3): >>><<< 8770 1726776629.61492: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776629.5674508-8770-38889254681613=/root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613 , stderr= 8770 1726776629.61514: variable 'ansible_module_compression' from source: unknown 8770 1726776629.61567: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8770 1726776629.61622: variable 'ansible_facts' from source: unknown 8770 1726776629.61848: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/AnsiballZ_systemd.py 8770 1726776629.62319: Sending initial data 8770 1726776629.62326: Sent initial data (153 bytes) 8770 1726776629.64975: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp8qwe7zc7 /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/AnsiballZ_systemd.py <<< 8770 1726776629.67199: stderr chunk 
(state=3): >>><<< 8770 1726776629.67205: stdout chunk (state=3): >>><<< 8770 1726776629.67223: done transferring module to remote 8770 1726776629.67235: _low_level_execute_command(): starting 8770 1726776629.67241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/ /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/AnsiballZ_systemd.py && sleep 0' 8770 1726776629.69608: stderr chunk (state=2): >>><<< 8770 1726776629.69616: stdout chunk (state=2): >>><<< 8770 1726776629.69631: _low_level_execute_command() done: rc=0, stdout=, stderr= 8770 1726776629.69635: _low_level_execute_command(): starting 8770 1726776629.69640: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/AnsiballZ_systemd.py && sleep 0' 8770 1726776629.98041: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:24 EDT", "WatchdogTimestampMonotonic": "235500210", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8410", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ExecMainStartTimestampMonotonic": "235251649", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8410", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 
16:10:24 EDT] ; stop_time=[n/a] ; pid=8410 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15015936", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": 
"0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:24 EDT", "StateChangeTimestampMonotonic": "235500213", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveExitTimestampMonotonic": "235251858", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveEnterTimestampMonotonic": "235500213", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveExitTimestampMonotonic": "235180357", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveEnterTimestampMonotonic": "235248719", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", 
"AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ConditionTimestampMonotonic": "235250552", "AssertTimestamp": "Thu 2024-09-19 16:10:24 EDT", "AssertTimestampMonotonic": "235250553", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "222bf69699fd489abd35224c3eab4032", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8770 1726776629.99785: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8770 1726776629.99797: stdout chunk (state=3): >>><<< 8770 1726776629.99807: stderr chunk (state=3): >>><<< 8770 1726776629.99825: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:24 EDT", "WatchdogTimestampMonotonic": "235500210", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8410", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ExecMainStartTimestampMonotonic": "235251649", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8410", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; 
start_time=[Thu 2024-09-19 16:10:24 EDT] ; stop_time=[n/a] ; pid=8410 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15015936", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": 
"0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:24 EDT", "StateChangeTimestampMonotonic": "235500213", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveExitTimestampMonotonic": "235251858", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveEnterTimestampMonotonic": "235500213", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveExitTimestampMonotonic": "235180357", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveEnterTimestampMonotonic": "235248719", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ConditionTimestampMonotonic": "235250552", "AssertTimestamp": "Thu 2024-09-19 16:10:24 EDT", "AssertTimestampMonotonic": "235250553", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "222bf69699fd489abd35224c3eab4032", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 8770 1726776629.99999: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8770 1726776630.00024: _low_level_execute_command(): starting 8770 1726776630.00032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776629.5674508-8770-38889254681613/ > /dev/null 2>&1 && sleep 0' 8770 1726776630.02678: stderr chunk (state=2): >>><<< 8770 1726776630.02687: stdout chunk (state=2): >>><<< 8770 1726776630.02701: _low_level_execute_command() done: rc=0, stdout=, stderr= 8770 1726776630.02708: handler run 
complete 8770 1726776630.02746: attempt loop complete, returning result 8770 1726776630.02764: variable 'item' from source: unknown 8770 1726776630.02823: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveEnterTimestampMonotonic": "235500213", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveExitTimestampMonotonic": "235180357", "ActiveState": "active", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:24 EDT", "AssertTimestampMonotonic": "235250553", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control 
cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ConditionTimestampMonotonic": "235250552", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8410", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ExecMainStartTimestampMonotonic": "235251649", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:24 EDT] ; stop_time=[n/a] ; pid=8410 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveEnterTimestampMonotonic": "235248719", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveExitTimestampMonotonic": "235251858", "InvocationID": 
"222bf69699fd489abd35224c3eab4032", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8410", "MemoryAccounting": "yes", "MemoryCurrent": "15015936", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:24 EDT", "StateChangeTimestampMonotonic": "235500213", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:24 EDT", "WatchdogTimestampMonotonic": "235500210", "WatchdogUSec": "0" } } 8770 1726776630.02917: dumping result to json 8770 1726776630.02940: done dumping result, returning 8770 1726776630.02948: done running 
TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-cec2-986e-0000000000b8] 8770 1726776630.02955: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b8 8770 1726776630.03062: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b8 8770 1726776630.03067: WORKER PROCESS EXITING 8218 1726776630.03404: no more pending results, returning what we have 8218 1726776630.03408: results queue empty 8218 1726776630.03409: checking for any_errors_fatal 8218 1726776630.03412: done checking for any_errors_fatal 8218 1726776630.03413: checking for max_fail_percentage 8218 1726776630.03414: done checking for max_fail_percentage 8218 1726776630.03414: checking to see if all hosts have failed and the running result is not ok 8218 1726776630.03415: done checking to see if all hosts have failed 8218 1726776630.03415: getting the remaining hosts for this loop 8218 1726776630.03416: done getting the remaining hosts for this loop 8218 1726776630.03422: getting the next task for host managed_node2 8218 1726776630.03429: done getting next task for host managed_node2 8218 1726776630.03434: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8218 1726776630.03436: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776630.03447: getting variables 8218 1726776630.03449: in VariableManager get_vars() 8218 1726776630.03474: Calling all_inventory to load vars for managed_node2 8218 1726776630.03477: Calling groups_inventory to load vars for managed_node2 8218 1726776630.03478: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776630.03487: Calling all_plugins_play to load vars for managed_node2 8218 1726776630.03489: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776630.03491: Calling groups_plugins_play to load vars for managed_node2 8218 1726776630.03638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776630.03836: done with get_vars() 8218 1726776630.03847: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.522) 0:00:15.869 **** 8218 1726776630.03935: entering _queue_task() for managed_node2/file 8218 1726776630.04128: worker is 1 (out of 1 available) 8218 1726776630.04145: exiting _queue_task() for managed_node2/file 8218 1726776630.04157: done queuing things up, now waiting for results queue to drain 8218 1726776630.04158: waiting for pending results... 
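The `_low_level_execute_command()` sequence traced above for each module follows a fixed lifecycle: create a restrictive temp directory on the target, transfer the AnsiballZ payload, `chmod u+x` the directory and payload, execute the payload (which prints a single JSON result to stdout), then `rm -f -r` the temp directory. The sketch below is a hedged, simplified illustration of that sequence, not Ansible's actual implementation: the function name `run_module_remotely` is invented for the example, and it runs the shell locally via `subprocess` instead of through an SSH connection plugin.

```python
import shlex
import subprocess
import time


def run_module_remotely(module_path: str, shell: str = "/bin/sh") -> str:
    """Illustrative sketch of the command sequence visible in the log.

    Each numbered step mirrors one _low_level_execute_command() call;
    here the shell runs locally rather than over SSH.
    """
    tmp = f"/tmp/ansible-tmp-{time.time()}"
    # 1. umask 77 && mkdir -p the throwaway temp directory
    subprocess.run([shell, "-c", f"umask 77 && mkdir -p {shlex.quote(tmp)}"],
                   check=True)
    try:
        # 2. transfer the module payload (the log uses `sftp> put`; cp stands in)
        subprocess.run(
            [shell, "-c",
             f"cp {shlex.quote(module_path)} {shlex.quote(tmp)}/AnsiballZ_module.py"],
            check=True)
        # 3. chmod u+x on both the directory and the payload
        subprocess.run(
            [shell, "-c",
             f"chmod u+x {shlex.quote(tmp)} {shlex.quote(tmp)}/AnsiballZ_module.py"],
            check=True)
        # 4. execute the payload; it emits one JSON document on stdout
        out = subprocess.run(
            [shell, "-c", f"python3 {shlex.quote(tmp)}/AnsiballZ_module.py"],
            capture_output=True, text=True, check=True)
        return out.stdout
    finally:
        # 5. remove the temp directory, mirroring the final cleanup command
        subprocess.run(
            [shell, "-c", f"rm -f -r {shlex.quote(tmp)} > /dev/null 2>&1"],
            check=True)
```

On success the returned stdout is the module's JSON result, which the controller parses into the task result shown in the log (here, the `ansible.legacy.systemd` result for `tuned`).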
8804 1726776630.04361: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8804 1726776630.04468: in run() - task 120fa90a-8a95-cec2-986e-0000000000b9 8804 1726776630.04485: variable 'ansible_search_path' from source: unknown 8804 1726776630.04490: variable 'ansible_search_path' from source: unknown 8804 1726776630.04518: calling self._execute() 8804 1726776630.04585: variable 'ansible_host' from source: host vars for 'managed_node2' 8804 1726776630.04593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8804 1726776630.04602: variable 'omit' from source: magic vars 8804 1726776630.04673: variable 'omit' from source: magic vars 8804 1726776630.04709: variable 'omit' from source: magic vars 8804 1726776630.04731: variable '__kernel_settings_profile_dir' from source: role '' all vars 8804 1726776630.04941: variable '__kernel_settings_profile_dir' from source: role '' all vars 8804 1726776630.05013: variable '__kernel_settings_profile_parent' from source: set_fact 8804 1726776630.05022: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8804 1726776630.05081: variable 'omit' from source: magic vars 8804 1726776630.05114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8804 1726776630.05142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8804 1726776630.05161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8804 1726776630.05175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8804 1726776630.05189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8804 1726776630.05212: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 8804 1726776630.05217: variable 'ansible_host' from source: host vars for 'managed_node2' 8804 1726776630.05222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8804 1726776630.05306: Set connection var ansible_connection to ssh 8804 1726776630.05316: Set connection var ansible_pipelining to False 8804 1726776630.05322: Set connection var ansible_timeout to 10 8804 1726776630.05331: Set connection var ansible_module_compression to ZIP_DEFLATED 8804 1726776630.05337: Set connection var ansible_shell_type to sh 8804 1726776630.05340: Set connection var ansible_shell_executable to /bin/sh 8804 1726776630.05354: variable 'ansible_shell_executable' from source: unknown 8804 1726776630.05357: variable 'ansible_connection' from source: unknown 8804 1726776630.05359: variable 'ansible_module_compression' from source: unknown 8804 1726776630.05361: variable 'ansible_shell_type' from source: unknown 8804 1726776630.05363: variable 'ansible_shell_executable' from source: unknown 8804 1726776630.05364: variable 'ansible_host' from source: host vars for 'managed_node2' 8804 1726776630.05367: variable 'ansible_pipelining' from source: unknown 8804 1726776630.05368: variable 'ansible_timeout' from source: unknown 8804 1726776630.05370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8804 1726776630.05506: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8804 1726776630.05515: variable 'omit' from source: magic vars 8804 1726776630.05520: starting attempt loop 8804 1726776630.05522: running the handler 8804 1726776630.05533: _low_level_execute_command(): starting 8804 1726776630.05539: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8804 1726776630.07907: stdout 
chunk (state=2): >>>/root <<< 8804 1726776630.08035: stderr chunk (state=3): >>><<< 8804 1726776630.08042: stdout chunk (state=3): >>><<< 8804 1726776630.08061: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8804 1726776630.08074: _low_level_execute_command(): starting 8804 1726776630.08080: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915 `" && echo ansible-tmp-1726776630.0806944-8804-90157534355915="` echo /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915 `" ) && sleep 0' 8804 1726776630.10567: stdout chunk (state=2): >>>ansible-tmp-1726776630.0806944-8804-90157534355915=/root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915 <<< 8804 1726776630.10702: stderr chunk (state=3): >>><<< 8804 1726776630.10708: stdout chunk (state=3): >>><<< 8804 1726776630.10723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776630.0806944-8804-90157534355915=/root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915 , stderr= 8804 1726776630.10761: variable 'ansible_module_compression' from source: unknown 8804 1726776630.10809: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 8804 1726776630.10840: variable 'ansible_facts' from source: unknown 8804 1726776630.10910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/AnsiballZ_file.py 8804 1726776630.11012: Sending initial data 8804 1726776630.11019: Sent initial data (150 bytes) 8804 1726776630.13713: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpd3y84596 /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/AnsiballZ_file.py <<< 8804 1726776630.15138: stderr chunk (state=3): >>><<< 8804 1726776630.15148: stdout chunk (state=3): 
>>><<< 8804 1726776630.15168: done transferring module to remote 8804 1726776630.15181: _low_level_execute_command(): starting 8804 1726776630.15187: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/ /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/AnsiballZ_file.py && sleep 0' 8804 1726776630.17654: stderr chunk (state=2): >>><<< 8804 1726776630.17671: stdout chunk (state=2): >>><<< 8804 1726776630.17691: _low_level_execute_command() done: rc=0, stdout=, stderr= 8804 1726776630.17697: _low_level_execute_command(): starting 8804 1726776630.17705: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/AnsiballZ_file.py && sleep 0' 8804 1726776630.33959: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8804 1726776630.35173: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8804 1726776630.35185: stdout chunk (state=3): >>><<< 8804 1726776630.35196: stderr chunk (state=3): >>><<< 8804 1726776630.35210: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
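The file-module result above can be read programmatically; a minimal sketch (using a trimmed copy of the literal JSON fields shown in the log) of the idempotence check a caller would make:

```python
import json

# Trimmed copy of the file-module result captured in the log above
result = json.loads("""
{"path": "/etc/tuned/kernel_settings", "changed": false,
 "owner": "root", "group": "root", "mode": "0755",
 "state": "directory",
 "secontext": "unconfined_u:object_r:tuned_etc_t:s0"}
""")

# changed == false: the directory already existed with the requested
# state and mode, so the task reports "ok" rather than "changed"
print(result["changed"], result["state"], result["mode"])  # False directory 0755
```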
8804 1726776630.35255: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8804 1726776630.35268: _low_level_execute_command(): starting 8804 1726776630.35274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776630.0806944-8804-90157534355915/ > /dev/null 2>&1 && sleep 0' 8804 1726776630.38222: stderr chunk (state=2): >>><<< 8804 1726776630.38234: stdout chunk (state=2): >>><<< 8804 1726776630.38249: _low_level_execute_command() done: rc=0, stdout=, stderr= 8804 1726776630.38257: handler run complete 8804 1726776630.38276: attempt loop complete, returning result 8804 1726776630.38280: _execute() done 8804 1726776630.38283: dumping result to json 8804 1726776630.38289: done dumping result, returning 8804 1726776630.38296: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-cec2-986e-0000000000b9] 8804 1726776630.38303: sending task result for task 120fa90a-8a95-cec2-986e-0000000000b9 8804 1726776630.38345: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000b9 8804 1726776630.38353: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": 
"unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8218 1726776630.38652: no more pending results, returning what we have 8218 1726776630.38654: results queue empty 8218 1726776630.38654: checking for any_errors_fatal 8218 1726776630.38663: done checking for any_errors_fatal 8218 1726776630.38664: checking for max_fail_percentage 8218 1726776630.38665: done checking for max_fail_percentage 8218 1726776630.38665: checking to see if all hosts have failed and the running result is not ok 8218 1726776630.38666: done checking to see if all hosts have failed 8218 1726776630.38666: getting the remaining hosts for this loop 8218 1726776630.38667: done getting the remaining hosts for this loop 8218 1726776630.38670: getting the next task for host managed_node2 8218 1726776630.38673: done getting next task for host managed_node2 8218 1726776630.38676: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8218 1726776630.38679: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776630.38686: getting variables 8218 1726776630.38687: in VariableManager get_vars() 8218 1726776630.38710: Calling all_inventory to load vars for managed_node2 8218 1726776630.38712: Calling groups_inventory to load vars for managed_node2 8218 1726776630.38714: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776630.38720: Calling all_plugins_play to load vars for managed_node2 8218 1726776630.38721: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776630.38723: Calling groups_plugins_play to load vars for managed_node2 8218 1726776630.38833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776630.38953: done with get_vars() 8218 1726776630.38962: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.350) 0:00:16.220 **** 8218 1726776630.39027: entering _queue_task() for managed_node2/slurp 8218 1726776630.39195: worker is 1 (out of 1 available) 8218 1726776630.39209: exiting _queue_task() for managed_node2/slurp 8218 1726776630.39220: done queuing things up, now waiting for results queue to drain 8218 1726776630.39222: waiting for pending results... 
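The `umask 77 && mkdir -p ...` commands in the transcript create ansible's unique per-task remote tmp directories. A sketch of the same pattern, run locally against a throwaway base directory instead of `~/.ansible/tmp` (the real suffix combines a timestamp, the worker pid, and a random number; a simplified name is assumed here):

```python
import os
import subprocess
import tempfile
import time

# Throwaway stand-in for the remote user's home directory
base = tempfile.mkdtemp()
name = f"ansible-tmp-{time.time()}-{os.getpid()}"

# Same shell pattern as _low_level_execute_command() in the log:
# umask 77 makes the tmp dir private to the owner (mode 0700)
cmd = (
    f'( umask 77 && mkdir -p "{base}/.ansible/tmp" '
    f'&& mkdir "{base}/.ansible/tmp/{name}" '
    f'&& echo "{base}/.ansible/tmp/{name}" ) && sleep 0'
)
out = subprocess.run(["/bin/sh", "-c", cmd], check=True,
                     capture_output=True, text=True).stdout.strip()
mode = os.stat(out).st_mode & 0o777
print(oct(mode))  # 0o700
```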
8834 1726776630.39337: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8834 1726776630.39436: in run() - task 120fa90a-8a95-cec2-986e-0000000000ba 8834 1726776630.39454: variable 'ansible_search_path' from source: unknown 8834 1726776630.39459: variable 'ansible_search_path' from source: unknown 8834 1726776630.39487: calling self._execute() 8834 1726776630.39550: variable 'ansible_host' from source: host vars for 'managed_node2' 8834 1726776630.39560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8834 1726776630.39568: variable 'omit' from source: magic vars 8834 1726776630.39639: variable 'omit' from source: magic vars 8834 1726776630.39679: variable 'omit' from source: magic vars 8834 1726776630.39698: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8834 1726776630.39906: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8834 1726776630.39963: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8834 1726776630.39996: variable 'omit' from source: magic vars 8834 1726776630.40027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8834 1726776630.40061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8834 1726776630.40080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8834 1726776630.40096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8834 1726776630.40108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8834 1726776630.40135: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8834 1726776630.40140: variable 'ansible_host' from source: host vars for 
'managed_node2' 8834 1726776630.40145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8834 1726776630.40213: Set connection var ansible_connection to ssh 8834 1726776630.40222: Set connection var ansible_pipelining to False 8834 1726776630.40230: Set connection var ansible_timeout to 10 8834 1726776630.40238: Set connection var ansible_module_compression to ZIP_DEFLATED 8834 1726776630.40243: Set connection var ansible_shell_type to sh 8834 1726776630.40248: Set connection var ansible_shell_executable to /bin/sh 8834 1726776630.40264: variable 'ansible_shell_executable' from source: unknown 8834 1726776630.40267: variable 'ansible_connection' from source: unknown 8834 1726776630.40330: variable 'ansible_module_compression' from source: unknown 8834 1726776630.40337: variable 'ansible_shell_type' from source: unknown 8834 1726776630.40341: variable 'ansible_shell_executable' from source: unknown 8834 1726776630.40345: variable 'ansible_host' from source: host vars for 'managed_node2' 8834 1726776630.40349: variable 'ansible_pipelining' from source: unknown 8834 1726776630.40353: variable 'ansible_timeout' from source: unknown 8834 1726776630.40357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8834 1726776630.40495: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8834 1726776630.40506: variable 'omit' from source: magic vars 8834 1726776630.40511: starting attempt loop 8834 1726776630.40515: running the handler 8834 1726776630.40527: _low_level_execute_command(): starting 8834 1726776630.40538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8834 1726776630.42917: stdout chunk (state=2): >>>/root <<< 8834 1726776630.43044: stderr chunk (state=3): >>><<< 8834 
1726776630.43052: stdout chunk (state=3): >>><<< 8834 1726776630.43072: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8834 1726776630.43086: _low_level_execute_command(): starting 8834 1726776630.43093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772 `" && echo ansible-tmp-1726776630.4308045-8834-2666638305772="` echo /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772 `" ) && sleep 0' 8834 1726776630.45621: stdout chunk (state=2): >>>ansible-tmp-1726776630.4308045-8834-2666638305772=/root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772 <<< 8834 1726776630.45754: stderr chunk (state=3): >>><<< 8834 1726776630.45762: stdout chunk (state=3): >>><<< 8834 1726776630.45780: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776630.4308045-8834-2666638305772=/root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772 , stderr= 8834 1726776630.45823: variable 'ansible_module_compression' from source: unknown 8834 1726776630.45859: ANSIBALLZ: Using lock for slurp 8834 1726776630.45864: ANSIBALLZ: Acquiring lock 8834 1726776630.45867: ANSIBALLZ: Lock acquired: 140571206408464 8834 1726776630.45871: ANSIBALLZ: Creating module 8834 1726776630.56948: ANSIBALLZ: Writing module into payload 8834 1726776630.57025: ANSIBALLZ: Writing module 8834 1726776630.57055: ANSIBALLZ: Renaming module 8834 1726776630.57064: ANSIBALLZ: Done creating module 8834 1726776630.57081: variable 'ansible_facts' from source: unknown 8834 1726776630.57145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/AnsiballZ_slurp.py 8834 1726776630.57251: Sending initial data 8834 1726776630.57258: Sent initial data (150 bytes) 8834 1726776630.60200: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp4ausa1ux 
/root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/AnsiballZ_slurp.py <<< 8834 1726776630.61349: stderr chunk (state=3): >>><<< 8834 1726776630.61357: stdout chunk (state=3): >>><<< 8834 1726776630.61373: done transferring module to remote 8834 1726776630.61386: _low_level_execute_command(): starting 8834 1726776630.61391: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/ /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/AnsiballZ_slurp.py && sleep 0' 8834 1726776630.63972: stderr chunk (state=2): >>><<< 8834 1726776630.63986: stdout chunk (state=2): >>><<< 8834 1726776630.64005: _low_level_execute_command() done: rc=0, stdout=, stderr= 8834 1726776630.64009: _low_level_execute_command(): starting 8834 1726776630.64015: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/AnsiballZ_slurp.py && sleep 0' 8834 1726776630.79331: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8834 1726776630.80382: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8834 1726776630.80432: stderr chunk (state=3): >>><<< 8834 1726776630.80439: stdout chunk (state=3): >>><<< 8834 1726776630.80455: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.12.75 closed. 
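The slurp module returns file content base64-encoded; decoding the `content` payload shown above recovers the current tuned profile:

```python
import base64

# "content" field from the slurp result in the log above
content = "dmlydHVhbC1ndWVzdAo="
text = base64.b64decode(content).decode("utf-8")
print(repr(text))  # 'virtual-guest\n'
```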
8834 1726776630.80482: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8834 1726776630.80493: _low_level_execute_command(): starting 8834 1726776630.80499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776630.4308045-8834-2666638305772/ > /dev/null 2>&1 && sleep 0' 8834 1726776630.82943: stderr chunk (state=2): >>><<< 8834 1726776630.82951: stdout chunk (state=2): >>><<< 8834 1726776630.82966: _low_level_execute_command() done: rc=0, stdout=, stderr= 8834 1726776630.82973: handler run complete 8834 1726776630.82988: attempt loop complete, returning result 8834 1726776630.82992: _execute() done 8834 1726776630.82995: dumping result to json 8834 1726776630.82999: done dumping result, returning 8834 1726776630.83007: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-cec2-986e-0000000000ba] 8834 1726776630.83013: sending task result for task 120fa90a-8a95-cec2-986e-0000000000ba 8834 1726776630.83042: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000ba 8834 1726776630.83046: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8218 1726776630.83180: no more pending results, returning what we have 8218 
1726776630.83183: results queue empty 8218 1726776630.83184: checking for any_errors_fatal 8218 1726776630.83193: done checking for any_errors_fatal 8218 1726776630.83193: checking for max_fail_percentage 8218 1726776630.83195: done checking for max_fail_percentage 8218 1726776630.83195: checking to see if all hosts have failed and the running result is not ok 8218 1726776630.83196: done checking to see if all hosts have failed 8218 1726776630.83197: getting the remaining hosts for this loop 8218 1726776630.83198: done getting the remaining hosts for this loop 8218 1726776630.83200: getting the next task for host managed_node2 8218 1726776630.83205: done getting next task for host managed_node2 8218 1726776630.83208: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8218 1726776630.83210: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776630.83218: getting variables 8218 1726776630.83220: in VariableManager get_vars() 8218 1726776630.83253: Calling all_inventory to load vars for managed_node2 8218 1726776630.83256: Calling groups_inventory to load vars for managed_node2 8218 1726776630.83258: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776630.83267: Calling all_plugins_play to load vars for managed_node2 8218 1726776630.83269: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776630.83272: Calling groups_plugins_play to load vars for managed_node2 8218 1726776630.83399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776630.83557: done with get_vars() 8218 1726776630.83564: done getting variables 8218 1726776630.83607: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.445) 0:00:16.666 **** 8218 1726776630.83631: entering _queue_task() for managed_node2/set_fact 8218 1726776630.83793: worker is 1 (out of 1 available) 8218 1726776630.83807: exiting _queue_task() for managed_node2/set_fact 8218 1726776630.83817: done queuing things up, now waiting for results queue to drain 8218 1726776630.83819: waiting for pending results... 
8857 1726776630.83930: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8857 1726776630.84034: in run() - task 120fa90a-8a95-cec2-986e-0000000000bb 8857 1726776630.84050: variable 'ansible_search_path' from source: unknown 8857 1726776630.84054: variable 'ansible_search_path' from source: unknown 8857 1726776630.84078: calling self._execute() 8857 1726776630.84135: variable 'ansible_host' from source: host vars for 'managed_node2' 8857 1726776630.84144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8857 1726776630.84150: variable 'omit' from source: magic vars 8857 1726776630.84216: variable 'omit' from source: magic vars 8857 1726776630.84260: variable 'omit' from source: magic vars 8857 1726776630.84547: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8857 1726776630.84557: variable '__cur_profile' from source: task vars 8857 1726776630.84662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8857 1726776630.86200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8857 1726776630.86270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8857 1726776630.86302: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8857 1726776630.86330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8857 1726776630.86350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8857 1726776630.86401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8857 1726776630.86419: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8857 1726776630.86440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8857 1726776630.86466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8857 1726776630.86474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8857 1726776630.86552: variable '__kernel_settings_tuned_current_profile' from source: set_fact 8857 1726776630.86589: variable 'omit' from source: magic vars 8857 1726776630.86609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8857 1726776630.86626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8857 1726776630.86646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8857 1726776630.86661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8857 1726776630.86671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8857 1726776630.86697: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8857 1726776630.86702: variable 'ansible_host' from source: host vars for 'managed_node2' 8857 1726776630.86706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8857 
1726776630.86797: Set connection var ansible_connection to ssh 8857 1726776630.86805: Set connection var ansible_pipelining to False 8857 1726776630.86811: Set connection var ansible_timeout to 10 8857 1726776630.86818: Set connection var ansible_module_compression to ZIP_DEFLATED 8857 1726776630.86824: Set connection var ansible_shell_type to sh 8857 1726776630.86832: Set connection var ansible_shell_executable to /bin/sh 8857 1726776630.86853: variable 'ansible_shell_executable' from source: unknown 8857 1726776630.86858: variable 'ansible_connection' from source: unknown 8857 1726776630.86861: variable 'ansible_module_compression' from source: unknown 8857 1726776630.86864: variable 'ansible_shell_type' from source: unknown 8857 1726776630.86866: variable 'ansible_shell_executable' from source: unknown 8857 1726776630.86869: variable 'ansible_host' from source: host vars for 'managed_node2' 8857 1726776630.86872: variable 'ansible_pipelining' from source: unknown 8857 1726776630.86875: variable 'ansible_timeout' from source: unknown 8857 1726776630.86879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8857 1726776630.86976: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8857 1726776630.86991: variable 'omit' from source: magic vars 8857 1726776630.86998: starting attempt loop 8857 1726776630.87001: running the handler 8857 1726776630.87011: handler run complete 8857 1726776630.87020: attempt loop complete, returning result 8857 1726776630.87023: _execute() done 8857 1726776630.87025: dumping result to json 8857 1726776630.87030: done dumping result, returning 8857 1726776630.87038: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : 
Set active_profile [120fa90a-8a95-cec2-986e-0000000000bb] 8857 1726776630.87045: sending task result for task 120fa90a-8a95-cec2-986e-0000000000bb 8857 1726776630.87068: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000bb 8857 1726776630.87072: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8218 1726776630.87523: no more pending results, returning what we have 8218 1726776630.87526: results queue empty 8218 1726776630.87527: checking for any_errors_fatal 8218 1726776630.87534: done checking for any_errors_fatal 8218 1726776630.87534: checking for max_fail_percentage 8218 1726776630.87535: done checking for max_fail_percentage 8218 1726776630.87536: checking to see if all hosts have failed and the running result is not ok 8218 1726776630.87536: done checking to see if all hosts have failed 8218 1726776630.87537: getting the remaining hosts for this loop 8218 1726776630.87537: done getting the remaining hosts for this loop 8218 1726776630.87540: getting the next task for host managed_node2 8218 1726776630.87544: done getting next task for host managed_node2 8218 1726776630.87547: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8218 1726776630.87548: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776630.87559: getting variables 8218 1726776630.87559: in VariableManager get_vars() 8218 1726776630.87586: Calling all_inventory to load vars for managed_node2 8218 1726776630.87588: Calling groups_inventory to load vars for managed_node2 8218 1726776630.87589: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776630.87595: Calling all_plugins_play to load vars for managed_node2 8218 1726776630.87597: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776630.87599: Calling groups_plugins_play to load vars for managed_node2 8218 1726776630.87708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776630.87826: done with get_vars() 8218 1726776630.87837: done getting variables 8218 1726776630.87880: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.042) 0:00:16.709 **** 8218 1726776630.87903: entering _queue_task() for managed_node2/copy 8218 1726776630.88061: worker is 1 (out of 1 available) 8218 1726776630.88076: exiting _queue_task() for managed_node2/copy 8218 1726776630.88090: done queuing things up, now waiting for results queue to drain 8218 1726776630.88091: waiting for pending results... 
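Before the next worker's output, a note on reading these records: each debug line in this `-vvvv` log follows the shape `<pid> <epoch.micros>: <message>`, where pid 8218 is the main playbook process and the other pids (8857, 8859, 8915) are forked workers. A minimal parsing sketch (the function name and tuple shape are illustrative, not part of Ansible):

```python
import re

# "<pid> <epoch-seconds.microseconds>: <message>" — the record shape used
# throughout this verbose log. 8218 is the controller; others are workers.
RECORD = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_record(line: str):
    """Split one debug record into (pid, timestamp, message); None if the
    line does not match the record shape (e.g. wrapped JSON payloads)."""
    m = RECORD.match(line)
    return (int(m["pid"]), float(m["ts"]), m["msg"]) if m else None
```

Filtering on the pid field is a quick way to separate the controller's strategy bookkeeping from a single worker's module execution.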
8859 1726776630.88206: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8859 1726776630.88304: in run() - task 120fa90a-8a95-cec2-986e-0000000000bc 8859 1726776630.88321: variable 'ansible_search_path' from source: unknown 8859 1726776630.88325: variable 'ansible_search_path' from source: unknown 8859 1726776630.88353: calling self._execute() 8859 1726776630.88410: variable 'ansible_host' from source: host vars for 'managed_node2' 8859 1726776630.88419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8859 1726776630.88427: variable 'omit' from source: magic vars 8859 1726776630.88500: variable 'omit' from source: magic vars 8859 1726776630.88535: variable 'omit' from source: magic vars 8859 1726776630.88556: variable '__kernel_settings_active_profile' from source: set_fact 8859 1726776630.88763: variable '__kernel_settings_active_profile' from source: set_fact 8859 1726776630.88784: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8859 1726776630.88836: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8859 1726776630.88889: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8859 1726776630.88963: variable 'omit' from source: magic vars 8859 1726776630.88997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8859 1726776630.89023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8859 1726776630.89045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8859 1726776630.89059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8859 1726776630.89068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 8859 1726776630.89091: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8859 1726776630.89096: variable 'ansible_host' from source: host vars for 'managed_node2' 8859 1726776630.89101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8859 1726776630.89169: Set connection var ansible_connection to ssh 8859 1726776630.89176: Set connection var ansible_pipelining to False 8859 1726776630.89183: Set connection var ansible_timeout to 10 8859 1726776630.89191: Set connection var ansible_module_compression to ZIP_DEFLATED 8859 1726776630.89196: Set connection var ansible_shell_type to sh 8859 1726776630.89201: Set connection var ansible_shell_executable to /bin/sh 8859 1726776630.89216: variable 'ansible_shell_executable' from source: unknown 8859 1726776630.89220: variable 'ansible_connection' from source: unknown 8859 1726776630.89224: variable 'ansible_module_compression' from source: unknown 8859 1726776630.89227: variable 'ansible_shell_type' from source: unknown 8859 1726776630.89232: variable 'ansible_shell_executable' from source: unknown 8859 1726776630.89236: variable 'ansible_host' from source: host vars for 'managed_node2' 8859 1726776630.89240: variable 'ansible_pipelining' from source: unknown 8859 1726776630.89243: variable 'ansible_timeout' from source: unknown 8859 1726776630.89246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8859 1726776630.89332: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8859 1726776630.89345: variable 'omit' from source: magic vars 8859 1726776630.89351: starting attempt loop 8859 1726776630.89354: running the handler 8859 1726776630.89366: 
_low_level_execute_command(): starting 8859 1726776630.89382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8859 1726776630.92138: stdout chunk (state=2): >>>/root <<< 8859 1726776630.92392: stderr chunk (state=3): >>><<< 8859 1726776630.92401: stdout chunk (state=3): >>><<< 8859 1726776630.92424: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8859 1726776630.92442: _low_level_execute_command(): starting 8859 1726776630.92449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357 `" && echo ansible-tmp-1726776630.9243557-8859-148096825111357="` echo /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357 `" ) && sleep 0' 8859 1726776630.95211: stdout chunk (state=2): >>>ansible-tmp-1726776630.9243557-8859-148096825111357=/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357 <<< 8859 1726776630.95260: stderr chunk (state=3): >>><<< 8859 1726776630.95267: stdout chunk (state=3): >>><<< 8859 1726776630.95290: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776630.9243557-8859-148096825111357=/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357 , stderr= 8859 1726776630.95367: variable 'ansible_module_compression' from source: unknown 8859 1726776630.95415: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8859 1726776630.95445: variable 'ansible_facts' from source: unknown 8859 1726776630.95513: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_stat.py 8859 1726776630.95599: Sending initial data 8859 1726776630.95606: Sent initial data (151 bytes) 8859 1726776630.98346: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpe2ze05b_ 
/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_stat.py <<< 8859 1726776630.99506: stderr chunk (state=3): >>><<< 8859 1726776630.99515: stdout chunk (state=3): >>><<< 8859 1726776630.99537: done transferring module to remote 8859 1726776630.99548: _low_level_execute_command(): starting 8859 1726776630.99553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/ /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_stat.py && sleep 0' 8859 1726776631.02700: stderr chunk (state=2): >>><<< 8859 1726776631.02710: stdout chunk (state=2): >>><<< 8859 1726776631.02727: _low_level_execute_command() done: rc=0, stdout=, stderr= 8859 1726776631.02734: _low_level_execute_command(): starting 8859 1726776631.02744: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_stat.py && sleep 0' 8859 1726776631.19091: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726776630.7929194, "mtime": 1726776624.4688988, "ctime": 1726776624.4688988, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": 
"/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8859 1726776631.20291: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8859 1726776631.20341: stderr chunk (state=3): >>><<< 8859 1726776631.20349: stdout chunk (state=3): >>><<< 8859 1726776631.20367: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726776630.7929194, "mtime": 1726776624.4688988, "ctime": 1726776624.4688988, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
8859 1726776631.20408: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8859 1726776631.20498: Sending initial data 8859 1726776631.20505: Sent initial data (140 bytes) 8859 1726776631.23070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp5k177r1x /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source <<< 8859 1726776631.23409: stderr chunk (state=3): >>><<< 8859 1726776631.23415: stdout chunk (state=3): >>><<< 8859 1726776631.23435: _low_level_execute_command(): starting 8859 1726776631.23441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/ /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source && sleep 0' 8859 1726776631.25810: stderr chunk (state=2): >>><<< 8859 1726776631.25817: stdout chunk (state=2): >>><<< 8859 1726776631.25832: _low_level_execute_command() done: rc=0, stdout=, stderr= 8859 1726776631.25853: variable 'ansible_module_compression' from source: unknown 8859 1726776631.25888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8859 1726776631.25908: variable 'ansible_facts' from source: unknown 8859 
1726776631.25966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_copy.py 8859 1726776631.26052: Sending initial data 8859 1726776631.26059: Sent initial data (151 bytes) 8859 1726776631.28534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmppcaqki2w /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_copy.py <<< 8859 1726776631.29625: stderr chunk (state=3): >>><<< 8859 1726776631.29633: stdout chunk (state=3): >>><<< 8859 1726776631.29652: done transferring module to remote 8859 1726776631.29661: _low_level_execute_command(): starting 8859 1726776631.29666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/ /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_copy.py && sleep 0' 8859 1726776631.32022: stderr chunk (state=2): >>><<< 8859 1726776631.32031: stdout chunk (state=2): >>><<< 8859 1726776631.32044: _low_level_execute_command() done: rc=0, stdout=, stderr= 8859 1726776631.32049: _low_level_execute_command(): starting 8859 1726776631.32054: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/AnsiballZ_copy.py && sleep 0' 8859 1726776631.48256: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": 
"/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source", "_original_basename": "tmp5k177r1x", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8859 1726776631.49473: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8859 1726776631.49518: stderr chunk (state=3): >>><<< 8859 1726776631.49525: stdout chunk (state=3): >>><<< 8859 1726776631.49542: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source", "_original_basename": "tmp5k177r1x", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
8859 1726776631.49568: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source', '_original_basename': 'tmp5k177r1x', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8859 1726776631.49579: _low_level_execute_command(): starting 8859 1726776631.49585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/ > /dev/null 2>&1 && sleep 0' 8859 1726776631.52164: stderr chunk (state=2): >>><<< 8859 1726776631.52173: stdout chunk (state=2): >>><<< 8859 1726776631.52186: _low_level_execute_command() done: rc=0, stdout=, stderr= 8859 1726776631.52194: handler run complete 8859 1726776631.52214: attempt loop complete, returning result 8859 1726776631.52218: _execute() done 8859 1726776631.52220: dumping result to json 8859 1726776631.52226: done dumping result, returning 8859 1726776631.52234: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-cec2-986e-0000000000bc] 8859 1726776631.52240: sending task result for task 120fa90a-8a95-cec2-986e-0000000000bc 8859 1726776631.52271: done sending task result for task 
120fa90a-8a95-cec2-986e-0000000000bc 8859 1726776631.52275: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "src": "/root/.ansible/tmp/ansible-tmp-1726776630.9243557-8859-148096825111357/source", "state": "file", "uid": 0 } 8218 1726776631.52433: no more pending results, returning what we have 8218 1726776631.52436: results queue empty 8218 1726776631.52437: checking for any_errors_fatal 8218 1726776631.52442: done checking for any_errors_fatal 8218 1726776631.52443: checking for max_fail_percentage 8218 1726776631.52445: done checking for max_fail_percentage 8218 1726776631.52445: checking to see if all hosts have failed and the running result is not ok 8218 1726776631.52446: done checking to see if all hosts have failed 8218 1726776631.52447: getting the remaining hosts for this loop 8218 1726776631.52447: done getting the remaining hosts for this loop 8218 1726776631.52450: getting the next task for host managed_node2 8218 1726776631.52456: done getting next task for host managed_node2 8218 1726776631.52460: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8218 1726776631.52463: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776631.52471: getting variables 8218 1726776631.52473: in VariableManager get_vars() 8218 1726776631.52504: Calling all_inventory to load vars for managed_node2 8218 1726776631.52507: Calling groups_inventory to load vars for managed_node2 8218 1726776631.52508: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776631.52517: Calling all_plugins_play to load vars for managed_node2 8218 1726776631.52519: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776631.52522: Calling groups_plugins_play to load vars for managed_node2 8218 1726776631.52664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776631.52781: done with get_vars() 8218 1726776631.52789: done getting variables 8218 1726776631.52830: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 16:10:31 -0400 (0:00:00.649) 0:00:17.359 **** 8218 1726776631.52852: entering _queue_task() for managed_node2/copy 8218 1726776631.53005: worker is 1 (out of 1 available) 8218 1726776631.53018: exiting _queue_task() for managed_node2/copy 8218 1726776631.53031: done queuing things up, now waiting for results queue to drain 8218 1726776631.53033: waiting for pending results... 
8915 1726776631.53146: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8915 1726776631.53245: in run() - task 120fa90a-8a95-cec2-986e-0000000000bd 8915 1726776631.53261: variable 'ansible_search_path' from source: unknown 8915 1726776631.53264: variable 'ansible_search_path' from source: unknown 8915 1726776631.53288: calling self._execute() 8915 1726776631.53352: variable 'ansible_host' from source: host vars for 'managed_node2' 8915 1726776631.53360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8915 1726776631.53368: variable 'omit' from source: magic vars 8915 1726776631.53438: variable 'omit' from source: magic vars 8915 1726776631.53470: variable 'omit' from source: magic vars 8915 1726776631.53490: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8915 1726776631.53705: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8915 1726776631.53765: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8915 1726776631.53796: variable 'omit' from source: magic vars 8915 1726776631.53830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8915 1726776631.53856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8915 1726776631.53874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8915 1726776631.53889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8915 1726776631.53901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8915 1726776631.53924: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8915 1726776631.53927: variable 'ansible_host' from source: host vars for 
'managed_node2' 8915 1726776631.53932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8915 1726776631.53994: Set connection var ansible_connection to ssh 8915 1726776631.54001: Set connection var ansible_pipelining to False 8915 1726776631.54005: Set connection var ansible_timeout to 10 8915 1726776631.54010: Set connection var ansible_module_compression to ZIP_DEFLATED 8915 1726776631.54014: Set connection var ansible_shell_type to sh 8915 1726776631.54017: Set connection var ansible_shell_executable to /bin/sh 8915 1726776631.54046: variable 'ansible_shell_executable' from source: unknown 8915 1726776631.54051: variable 'ansible_connection' from source: unknown 8915 1726776631.54054: variable 'ansible_module_compression' from source: unknown 8915 1726776631.54057: variable 'ansible_shell_type' from source: unknown 8915 1726776631.54061: variable 'ansible_shell_executable' from source: unknown 8915 1726776631.54064: variable 'ansible_host' from source: host vars for 'managed_node2' 8915 1726776631.54068: variable 'ansible_pipelining' from source: unknown 8915 1726776631.54071: variable 'ansible_timeout' from source: unknown 8915 1726776631.54075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8915 1726776631.54167: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8915 1726776631.54179: variable 'omit' from source: magic vars 8915 1726776631.54185: starting attempt loop 8915 1726776631.54189: running the handler 8915 1726776631.54200: _low_level_execute_command(): starting 8915 1726776631.54207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8915 1726776631.56560: stdout chunk (state=2): >>>/root <<< 8915 1726776631.56677: stderr chunk 
(state=3): >>><<< 8915 1726776631.56686: stdout chunk (state=3): >>><<< 8915 1726776631.56704: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8915 1726776631.56716: _low_level_execute_command(): starting 8915 1726776631.56722: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605 `" && echo ansible-tmp-1726776631.567117-8915-164769422567605="` echo /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605 `" ) && sleep 0' 8915 1726776631.59270: stdout chunk (state=2): >>>ansible-tmp-1726776631.567117-8915-164769422567605=/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605 <<< 8915 1726776631.59401: stderr chunk (state=3): >>><<< 8915 1726776631.59407: stdout chunk (state=3): >>><<< 8915 1726776631.59420: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776631.567117-8915-164769422567605=/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605 , stderr= 8915 1726776631.59489: variable 'ansible_module_compression' from source: unknown 8915 1726776631.59533: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8915 1726776631.59559: variable 'ansible_facts' from source: unknown 8915 1726776631.59628: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_stat.py 8915 1726776631.59713: Sending initial data 8915 1726776631.59720: Sent initial data (150 bytes) 8915 1726776631.62363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpwbkhqezj /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_stat.py <<< 8915 1726776631.63883: stderr chunk (state=3): >>><<< 8915 1726776631.63892: stdout chunk (state=3): >>><<< 8915 1726776631.63915: done transferring module to remote 8915 
1726776631.63926: _low_level_execute_command(): starting 8915 1726776631.63937: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/ /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_stat.py && sleep 0' 8915 1726776631.67378: stderr chunk (state=2): >>><<< 8915 1726776631.67391: stdout chunk (state=2): >>><<< 8915 1726776631.67409: _low_level_execute_command() done: rc=0, stdout=, stderr= 8915 1726776631.67415: _low_level_execute_command(): starting 8915 1726776631.67421: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_stat.py && sleep 0' 8915 1726776631.84669: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726776624.3388984, "mtime": 1726776624.4688988, "ctime": 1726776624.4688988, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8915 1726776631.85887: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8915 1726776631.85932: stderr chunk (state=3): >>><<< 8915 1726776631.85938: stdout chunk (state=3): >>><<< 8915 1726776631.85951: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726776624.3388984, "mtime": 1726776624.4688988, "ctime": 1726776624.4688988, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
8915 1726776631.85996: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8915 1726776631.86077: Sending initial data 8915 1726776631.86085: Sent initial data (139 bytes) 8915 1726776631.88805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpbc2n65c_ /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source <<< 8915 1726776631.89163: stderr chunk (state=3): >>><<< 8915 1726776631.89170: stdout chunk (state=3): >>><<< 8915 1726776631.89189: _low_level_execute_command(): starting 8915 1726776631.89196: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/ /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source && sleep 0' 8915 1726776631.91545: stderr chunk (state=2): >>><<< 8915 1726776631.91554: stdout chunk (state=2): >>><<< 8915 1726776631.91566: _low_level_execute_command() done: rc=0, stdout=, stderr= 8915 1726776631.91588: variable 'ansible_module_compression' from source: unknown 8915 1726776631.91621: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8915 1726776631.91639: variable 'ansible_facts' from source: unknown 8915 
1726776631.91698: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_copy.py 8915 1726776631.91780: Sending initial data 8915 1726776631.91787: Sent initial data (150 bytes) 8915 1726776631.94245: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpiljd6p8m /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_copy.py <<< 8915 1726776631.95355: stderr chunk (state=3): >>><<< 8915 1726776631.95364: stdout chunk (state=3): >>><<< 8915 1726776631.95385: done transferring module to remote 8915 1726776631.95394: _low_level_execute_command(): starting 8915 1726776631.95400: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/ /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_copy.py && sleep 0' 8915 1726776631.98199: stderr chunk (state=2): >>><<< 8915 1726776631.98211: stdout chunk (state=2): >>><<< 8915 1726776631.98227: _low_level_execute_command() done: rc=0, stdout=, stderr= 8915 1726776631.98233: _low_level_execute_command(): starting 8915 1726776631.98238: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/AnsiballZ_copy.py && sleep 0' 8915 1726776632.14935: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source", 
"_original_basename": "tmpbc2n65c_", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8915 1726776632.16100: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 8915 1726776632.16113: stdout chunk (state=3): >>><<< 8915 1726776632.16124: stderr chunk (state=3): >>><<< 8915 1726776632.16141: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source", "_original_basename": "tmpbc2n65c_", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
8915 1726776632.16177: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source', '_original_basename': 'tmpbc2n65c_', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8915 1726776632.16190: _low_level_execute_command(): starting 8915 1726776632.16196: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/ > /dev/null 2>&1 && sleep 0' 8915 1726776632.18995: stderr chunk (state=2): >>><<< 8915 1726776632.19006: stdout chunk (state=2): >>><<< 8915 1726776632.19023: _low_level_execute_command() done: rc=0, stdout=, stderr= 8915 1726776632.19034: handler run complete 8915 1726776632.19062: attempt loop complete, returning result 8915 1726776632.19067: _execute() done 8915 1726776632.19070: dumping result to json 8915 1726776632.19076: done dumping result, returning 8915 1726776632.19084: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-cec2-986e-0000000000bd] 8915 1726776632.19092: sending task result for task 120fa90a-8a95-cec2-986e-0000000000bd 8915 1726776632.19136: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000bd 8915 
1726776632.19140: WORKER PROCESS EXITING
changed: [managed_node2] => {
    "changed": true,
    "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce",
    "dest": "/etc/tuned/profile_mode",
    "gid": 0,
    "group": "root",
    "md5sum": "cf3f2a865fbea819dadd439586eaee31",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 7,
    "src": "/root/.ansible/tmp/ansible-tmp-1726776631.567117-8915-164769422567605/source",
    "state": "file",
    "uid": 0
}
8218 1726776632.19554: no more pending results, returning what we have
8218 1726776632.19557: results queue empty
8218 1726776632.19558: checking for any_errors_fatal
8218 1726776632.19564: done checking for any_errors_fatal
8218 1726776632.19564: checking for max_fail_percentage
8218 1726776632.19566: done checking for max_fail_percentage
8218 1726776632.19566: checking to see if all hosts have failed and the running result is not ok
8218 1726776632.19567: done checking to see if all hosts have failed
8218 1726776632.19567: getting the remaining hosts for this loop
8218 1726776632.19568: done getting the remaining hosts for this loop
8218 1726776632.19571: getting the next task for host managed_node2
8218 1726776632.19577: done getting next task for host managed_node2
8218 1726776632.19580: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config
8218 1726776632.19582: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776632.19591: getting variables
8218 1726776632.19593: in VariableManager get_vars()
8218 1726776632.19624: Calling all_inventory to load vars for managed_node2
8218 1726776632.19627: Calling groups_inventory to load vars for managed_node2
8218 1726776632.19635: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776632.19644: Calling all_plugins_play to load vars for managed_node2
8218 1726776632.19647: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776632.19650: Calling groups_plugins_play to load vars for managed_node2
8218 1726776632.19814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776632.20011: done with get_vars()
8218 1726776632.20022: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get current config] **********
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107
Thursday 19 September 2024  16:10:32 -0400 (0:00:00.672)       0:00:18.031 ****
8218 1726776632.20101: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config
8218 1726776632.20283: worker is 1 (out of 1 available)
8218 1726776632.20295: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config
8218 1726776632.20306: done queuing things up, now waiting for results queue to drain
8218 1726776632.20307: waiting for pending results...
8964 1726776632.21399: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 8964 1726776632.21523: in run() - task 120fa90a-8a95-cec2-986e-0000000000be 8964 1726776632.21543: variable 'ansible_search_path' from source: unknown 8964 1726776632.21548: variable 'ansible_search_path' from source: unknown 8964 1726776632.21581: calling self._execute() 8964 1726776632.21661: variable 'ansible_host' from source: host vars for 'managed_node2' 8964 1726776632.21672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8964 1726776632.21682: variable 'omit' from source: magic vars 8964 1726776632.21782: variable 'omit' from source: magic vars 8964 1726776632.21827: variable 'omit' from source: magic vars 8964 1726776632.21856: variable '__kernel_settings_profile_filename' from source: role '' all vars 8964 1726776632.22127: variable '__kernel_settings_profile_filename' from source: role '' all vars 8964 1726776632.22211: variable '__kernel_settings_profile_dir' from source: role '' all vars 8964 1726776632.22962: variable '__kernel_settings_profile_parent' from source: set_fact 8964 1726776632.22971: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8964 1726776632.23013: variable 'omit' from source: magic vars 8964 1726776632.23056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8964 1726776632.23087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8964 1726776632.23109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8964 1726776632.23126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8964 1726776632.23142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 8964 1726776632.23170: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8964 1726776632.23177: variable 'ansible_host' from source: host vars for 'managed_node2' 8964 1726776632.23181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8964 1726776632.23274: Set connection var ansible_connection to ssh 8964 1726776632.23283: Set connection var ansible_pipelining to False 8964 1726776632.23290: Set connection var ansible_timeout to 10 8964 1726776632.23298: Set connection var ansible_module_compression to ZIP_DEFLATED 8964 1726776632.23304: Set connection var ansible_shell_type to sh 8964 1726776632.23309: Set connection var ansible_shell_executable to /bin/sh 8964 1726776632.23330: variable 'ansible_shell_executable' from source: unknown 8964 1726776632.23337: variable 'ansible_connection' from source: unknown 8964 1726776632.23340: variable 'ansible_module_compression' from source: unknown 8964 1726776632.23343: variable 'ansible_shell_type' from source: unknown 8964 1726776632.23346: variable 'ansible_shell_executable' from source: unknown 8964 1726776632.23348: variable 'ansible_host' from source: host vars for 'managed_node2' 8964 1726776632.23352: variable 'ansible_pipelining' from source: unknown 8964 1726776632.23354: variable 'ansible_timeout' from source: unknown 8964 1726776632.23357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8964 1726776632.23523: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8964 1726776632.23537: variable 'omit' from source: magic vars 8964 1726776632.23544: starting attempt loop 8964 1726776632.23548: running the handler 8964 1726776632.23560: _low_level_execute_command(): starting 8964 1726776632.23568: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8964 1726776632.26152: stdout chunk (state=2): >>>/root <<< 8964 1726776632.26282: stderr chunk (state=3): >>><<< 8964 1726776632.26289: stdout chunk (state=3): >>><<< 8964 1726776632.26308: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8964 1726776632.26322: _low_level_execute_command(): starting 8964 1726776632.26330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205 `" && echo ansible-tmp-1726776632.2631636-8964-86544952441205="` echo /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205 `" ) && sleep 0' 8964 1726776632.32539: stdout chunk (state=2): >>>ansible-tmp-1726776632.2631636-8964-86544952441205=/root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205 <<< 8964 1726776632.32551: stderr chunk (state=2): >>><<< 8964 1726776632.32563: stdout chunk (state=3): >>><<< 8964 1726776632.32578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776632.2631636-8964-86544952441205=/root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205 , stderr= 8964 1726776632.32624: variable 'ansible_module_compression' from source: unknown 8964 1726776632.32669: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 8964 1726776632.32708: variable 'ansible_facts' from source: unknown 8964 1726776632.32805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/AnsiballZ_kernel_settings_get_config.py 8964 1726776632.33705: Sending initial data 8964 1726776632.33712: Sent initial data (172 bytes) 8964 1726776632.37320: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp10m13eqh 
/root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/AnsiballZ_kernel_settings_get_config.py <<< 8964 1726776632.39493: stderr chunk (state=3): >>><<< 8964 1726776632.39504: stdout chunk (state=3): >>><<< 8964 1726776632.39526: done transferring module to remote 8964 1726776632.39540: _low_level_execute_command(): starting 8964 1726776632.39548: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/ /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8964 1726776632.42455: stderr chunk (state=2): >>><<< 8964 1726776632.42465: stdout chunk (state=2): >>><<< 8964 1726776632.42484: _low_level_execute_command() done: rc=0, stdout=, stderr= 8964 1726776632.42489: _low_level_execute_command(): starting 8964 1726776632.42495: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8964 1726776632.58453: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": "65530"}, "sysfs": {"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0", "/sys/kernel/debug/x86/ibrs_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 8964 1726776632.59556: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 8964 1726776632.59606: stderr chunk (state=3): >>><<< 8964 1726776632.59613: stdout chunk (state=3): >>><<< 8964 1726776632.59633: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": "65530"}, "sysfs": {"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0", "/sys/kernel/debug/x86/ibrs_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 8964 1726776632.59660: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8964 1726776632.59672: _low_level_execute_command(): starting 8964 1726776632.59677: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776632.2631636-8964-86544952441205/ > /dev/null 2>&1 && sleep 0' 8964 1726776632.62141: stderr chunk (state=2): >>><<< 8964 1726776632.62149: stdout chunk (state=2): >>><<< 8964 1726776632.62162: _low_level_execute_command() done: rc=0, stdout=, stderr= 8964 1726776632.62169: handler run complete 8964 
1726776632.62188: attempt loop complete, returning result
8964 1726776632.62193: _execute() done
8964 1726776632.62196: dumping result to json
8964 1726776632.62200: done dumping result, returning
8964 1726776632.62207: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-cec2-986e-0000000000be]
8964 1726776632.62213: sending task result for task 120fa90a-8a95-cec2-986e-0000000000be
8964 1726776632.62240: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000be
8964 1726776632.62242: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "data": {
        "main": {
            "summary": "kernel settings"
        },
        "sysctl": {
            "fs.epoll.max_user_watches": "785592",
            "fs.file-max": "379724",
            "kernel.threads-max": "29968",
            "vm.max_map_count": "65530"
        },
        "sysfs": {
            "/sys/fs/selinux/avc/cache_threshold": "256",
            "/sys/kernel/debug/x86/ibrs_enabled": "0",
            "/sys/kernel/debug/x86/pti_enabled": "0",
            "/sys/kernel/debug/x86/retp_enabled": "0"
        }
    }
}
8218 1726776632.62518: no more pending results, returning what we have
8218 1726776632.62520: results queue empty
8218 1726776632.62521: checking for any_errors_fatal
8218 1726776632.62526: done checking for any_errors_fatal
8218 1726776632.62527: checking for max_fail_percentage
8218 1726776632.62528: done checking for max_fail_percentage
8218 1726776632.62529: checking to see if all hosts have failed and the running result is not ok
8218 1726776632.62530: done checking to see if all hosts have failed
8218 1726776632.62530: getting the remaining hosts for this loop
8218 1726776632.62531: done getting the remaining hosts for this loop
8218 1726776632.62534: getting the next task for host managed_node2
8218 1726776632.62539: done getting next task for host managed_node2
8218 1726776632.62541: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings
8218 1726776632.62543: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776632.62549: getting variables
8218 1726776632.62550: in VariableManager get_vars()
8218 1726776632.62574: Calling all_inventory to load vars for managed_node2
8218 1726776632.62576: Calling groups_inventory to load vars for managed_node2
8218 1726776632.62577: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776632.62584: Calling all_plugins_play to load vars for managed_node2
8218 1726776632.62586: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776632.62587: Calling groups_plugins_play to load vars for managed_node2
8218 1726776632.62726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776632.62849: done with get_vars()
8218 1726776632.62857: done getting variables
8218 1726776632.62939: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] *******
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Thursday 19 September 2024  16:10:32 -0400 (0:00:00.428)       0:00:18.460 ****
8218 1726776632.62961: entering _queue_task() for managed_node2/template
8218 1726776632.62962: Creating lock for template 8218 1726776632.63133: worker is 1 (out of 1 available) 8218 1726776632.63146: exiting _queue_task() for managed_node2/template 8218 1726776632.63157: done queuing things up, now waiting for results queue to drain 8218 1726776632.63158: waiting for pending results... 9009 1726776632.63274: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 9009 1726776632.63375: in run() - task 120fa90a-8a95-cec2-986e-0000000000bf 9009 1726776632.63393: variable 'ansible_search_path' from source: unknown 9009 1726776632.63397: variable 'ansible_search_path' from source: unknown 9009 1726776632.63425: calling self._execute() 9009 1726776632.63496: variable 'ansible_host' from source: host vars for 'managed_node2' 9009 1726776632.63505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9009 1726776632.63513: variable 'omit' from source: magic vars 9009 1726776632.63585: variable 'omit' from source: magic vars 9009 1726776632.63619: variable 'omit' from source: magic vars 9009 1726776632.63847: variable '__kernel_settings_profile_src' from source: role '' all vars 9009 1726776632.63856: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9009 1726776632.63914: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9009 1726776632.63936: variable '__kernel_settings_profile_filename' from source: role '' all vars 9009 1726776632.63983: variable '__kernel_settings_profile_filename' from source: role '' all vars 9009 1726776632.64053: variable '__kernel_settings_profile_dir' from source: role '' all vars 9009 1726776632.64139: variable '__kernel_settings_profile_parent' from source: set_fact 9009 1726776632.64148: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9009 1726776632.64176: variable 'omit' from source: magic vars 9009 1726776632.64219: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9009 1726776632.64254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9009 1726776632.64277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9009 1726776632.64297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9009 1726776632.64310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9009 1726776632.64339: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9009 1726776632.64345: variable 'ansible_host' from source: host vars for 'managed_node2' 9009 1726776632.64349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9009 1726776632.64442: Set connection var ansible_connection to ssh 9009 1726776632.64451: Set connection var ansible_pipelining to False 9009 1726776632.64457: Set connection var ansible_timeout to 10 9009 1726776632.64465: Set connection var ansible_module_compression to ZIP_DEFLATED 9009 1726776632.64470: Set connection var ansible_shell_type to sh 9009 1726776632.64476: Set connection var ansible_shell_executable to /bin/sh 9009 1726776632.64499: variable 'ansible_shell_executable' from source: unknown 9009 1726776632.64505: variable 'ansible_connection' from source: unknown 9009 1726776632.64508: variable 'ansible_module_compression' from source: unknown 9009 1726776632.64511: variable 'ansible_shell_type' from source: unknown 9009 1726776632.64514: variable 'ansible_shell_executable' from source: unknown 9009 1726776632.64516: variable 'ansible_host' from source: host vars for 'managed_node2' 9009 1726776632.64519: variable 'ansible_pipelining' from source: unknown 9009 1726776632.64522: variable 'ansible_timeout' from source: unknown 9009 
1726776632.64526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9009 1726776632.64655: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9009 1726776632.64667: variable 'omit' from source: magic vars 9009 1726776632.64673: starting attempt loop 9009 1726776632.64676: running the handler 9009 1726776632.64690: _low_level_execute_command(): starting 9009 1726776632.64697: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9009 1726776632.67152: stdout chunk (state=2): >>>/root <<< 9009 1726776632.67273: stderr chunk (state=3): >>><<< 9009 1726776632.67279: stdout chunk (state=3): >>><<< 9009 1726776632.67299: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9009 1726776632.67311: _low_level_execute_command(): starting 9009 1726776632.67316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694 `" && echo ansible-tmp-1726776632.6730611-9009-95546481590694="` echo /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694 `" ) && sleep 0' 9009 1726776632.70031: stdout chunk (state=2): >>>ansible-tmp-1726776632.6730611-9009-95546481590694=/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694 <<< 9009 1726776632.70169: stderr chunk (state=3): >>><<< 9009 1726776632.70176: stdout chunk (state=3): >>><<< 9009 1726776632.70191: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776632.6730611-9009-95546481590694=/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694 , stderr= 9009 1726776632.70206: evaluation_path: 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 9009 1726776632.70224: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 9009 1726776632.70251: variable 'ansible_search_path' from source: unknown 9009 1726776632.70845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9009 1726776632.72336: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9009 1726776632.72667: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9009 1726776632.72707: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9009 1726776632.72742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9009 1726776632.72767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9009 1726776632.73070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9009 
1726776632.73103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9009 1726776632.73131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9009 1726776632.73172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9009 1726776632.73189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9009 1726776632.73560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9009 1726776632.73587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9009 1726776632.73611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9009 1726776632.73651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9009 1726776632.73665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 9009 1726776632.74764: variable 'ansible_managed' from source: unknown 9009 1726776632.74773: variable '__sections' from source: task vars 9009 1726776632.74909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9009 1726776632.74936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9009 1726776632.74960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9009 1726776632.74999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9009 1726776632.75013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9009 1726776632.75122: variable 'kernel_settings_sysctl' from source: include params 9009 1726776632.75135: variable '__kernel_settings_state_empty' from source: role '' all vars 9009 1726776632.75142: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9009 1726776632.75195: variable '__sysctl_old' from source: task vars 9009 1726776632.75259: variable '__sysctl_old' from source: task vars 9009 1726776632.75484: variable 'kernel_settings_purge' from source: role '' defaults 9009 1726776632.75492: variable 'kernel_settings_sysctl' from source: include params 9009 1726776632.75500: variable '__kernel_settings_state_empty' from source: role '' all vars 9009 1726776632.75505: variable 
'__kernel_settings_previous_replaced' from source: role '' all vars 9009 1726776632.75510: variable '__kernel_settings_profile_contents' from source: set_fact 9009 1726776632.75727: variable 'kernel_settings_sysfs' from source: include params 9009 1726776632.75739: variable '__kernel_settings_state_empty' from source: role '' all vars 9009 1726776632.75745: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9009 1726776632.75767: variable '__sysfs_old' from source: task vars 9009 1726776632.75827: variable '__sysfs_old' from source: task vars 9009 1726776632.76090: variable 'kernel_settings_purge' from source: role '' defaults 9009 1726776632.76097: variable 'kernel_settings_sysfs' from source: include params 9009 1726776632.76104: variable '__kernel_settings_state_empty' from source: role '' all vars 9009 1726776632.76109: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9009 1726776632.76114: variable '__kernel_settings_profile_contents' from source: set_fact 9009 1726776632.76162: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 9009 1726776632.76172: variable '__systemd_old' from source: task vars 9009 1726776632.76232: variable '__systemd_old' from source: task vars 9009 1726776632.76437: variable 'kernel_settings_purge' from source: role '' defaults 9009 1726776632.76444: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 9009 1726776632.76449: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.76454: variable '__kernel_settings_profile_contents' from source: set_fact 9009 1726776632.76469: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 9009 1726776632.76474: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 9009 1726776632.76479: variable '__trans_huge_old' from source: task vars 9009 1726776632.76539: variable '__trans_huge_old' from source: task 
vars 9009 1726776632.76743: variable 'kernel_settings_purge' from source: role '' defaults 9009 1726776632.76750: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 9009 1726776632.76755: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.76760: variable '__kernel_settings_profile_contents' from source: set_fact 9009 1726776632.76772: variable '__trans_defrag_old' from source: task vars 9009 1726776632.76832: variable '__trans_defrag_old' from source: task vars 9009 1726776632.77095: variable 'kernel_settings_purge' from source: role '' defaults 9009 1726776632.77102: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 9009 1726776632.77106: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77112: variable '__kernel_settings_profile_contents' from source: set_fact 9009 1726776632.77132: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77145: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77159: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77175: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77186: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77193: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77210: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77218: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77228: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77235: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77244: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77252: variable 
'__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77258: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77264: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.77270: variable '__kernel_settings_state_absent' from source: role '' all vars 9009 1726776632.78074: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9009 1726776632.78133: variable 'ansible_module_compression' from source: unknown 9009 1726776632.78190: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9009 1726776632.78216: variable 'ansible_facts' from source: unknown 9009 1726776632.78300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_stat.py 9009 1726776632.78846: Sending initial data 9009 1726776632.78854: Sent initial data (150 bytes) 9009 1726776632.81459: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp3h0t_71l /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_stat.py <<< 9009 1726776632.82846: stderr chunk (state=3): >>><<< 9009 1726776632.82857: stdout chunk (state=3): >>><<< 9009 1726776632.82881: done transferring module to remote 9009 1726776632.82894: _low_level_execute_command(): starting 9009 1726776632.82899: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/ /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_stat.py && sleep 0' 9009 1726776632.86036: stderr chunk (state=2): >>><<< 9009 1726776632.86046: stdout chunk 
(state=2): >>><<< 9009 1726776632.86061: _low_level_execute_command() done: rc=0, stdout=, stderr= 9009 1726776632.86066: _low_level_execute_command(): starting 9009 1726776632.86071: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_stat.py && sleep 0' 9009 1726776633.02535: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 381, "inode": 473956483, "dev": 51713, "nlink": 1, "atime": 1726776632.582925, "mtime": 1726776623.0698943, "ctime": 1726776623.4238954, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "mimetype": "text/plain", "charset": "us-ascii", "version": "1830285242", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9009 1726776633.03692: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 9009 1726776633.03737: stderr chunk (state=3): >>><<< 9009 1726776633.03743: stdout chunk (state=3): >>><<< 9009 1726776633.03758: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 381, "inode": 473956483, "dev": 51713, "nlink": 1, "atime": 1726776632.582925, "mtime": 1726776623.0698943, "ctime": 1726776623.4238954, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "mimetype": "text/plain", "charset": "us-ascii", "version": "1830285242", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
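The stat step above shows the controller asking the managed node whether `/etc/tuned/kernel_settings/tuned.conf` already exists and what its SHA-1 checksum is, so it can decide whether the templated file needs to be re-copied. The core of that check can be sketched with the standard library; this is a hypothetical approximation of a few of the fields `ansible.builtin.stat` reports, not the module's actual implementation:

```python
import hashlib
import os
import stat as stat_mod


def stat_file(path, checksum_algorithm="sha1"):
    """Collect a small subset of the fields ansible.builtin.stat returns.

    Hypothetical sketch: the real module reports many more fields
    (mimetype, SELinux context, attributes, ...) than shown here.
    """
    if not os.path.exists(path):
        return {"exists": False}
    st = os.stat(path)
    # Checksum the file in chunks, as large files may not fit in memory.
    h = hashlib.new(checksum_algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return {
        "exists": True,
        "path": path,
        "mode": "%04o" % stat_mod.S_IMODE(st.st_mode),
        "isreg": stat_mod.S_ISREG(st.st_mode),
        "size": st.st_size,
        "uid": st.st_uid,
        "gid": st.st_gid,
        "checksum": h.hexdigest(),
    }
```

In the log above, the returned `checksum` (`13fdc203...`) differs from the checksum of the freshly rendered template (`3feaf86b...`), which is why the controller proceeds to transfer and copy the new file.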
9009 1726776633.03797: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9009 1726776633.03878: Sending initial data 9009 1726776633.03886: Sent initial data (158 bytes) 9009 1726776633.06440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpex8ow1ha/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source <<< 9009 1726776633.06787: stderr chunk (state=3): >>><<< 9009 1726776633.06794: stdout chunk (state=3): >>><<< 9009 1726776633.06807: _low_level_execute_command(): starting 9009 1726776633.06813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/ /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source && sleep 0' 9009 1726776633.09216: stderr chunk (state=2): >>><<< 9009 1726776633.09225: stdout chunk (state=2): >>><<< 9009 1726776633.09243: _low_level_execute_command() done: rc=0, stdout=, stderr= 9009 1726776633.09264: variable 'ansible_module_compression' from source: unknown 9009 1726776633.09297: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 9009 1726776633.09315: variable 
'ansible_facts' from source: unknown 9009 1726776633.09375: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_copy.py 9009 1726776633.09461: Sending initial data 9009 1726776633.09468: Sent initial data (150 bytes) 9009 1726776633.12061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpw8nmvfqp /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_copy.py <<< 9009 1726776633.13207: stderr chunk (state=3): >>><<< 9009 1726776633.13213: stdout chunk (state=3): >>><<< 9009 1726776633.13231: done transferring module to remote 9009 1726776633.13239: _low_level_execute_command(): starting 9009 1726776633.13244: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/ /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_copy.py && sleep 0' 9009 1726776633.15626: stderr chunk (state=2): >>><<< 9009 1726776633.15634: stdout chunk (state=2): >>><<< 9009 1726776633.15647: _low_level_execute_command() done: rc=0, stdout=, stderr= 9009 1726776633.15651: _low_level_execute_command(): starting 9009 1726776633.15656: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/AnsiballZ_copy.py && sleep 0' 9009 1726776633.32230: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source", 
"dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9009 1726776633.33418: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9009 1726776633.33430: stdout chunk (state=3): >>><<< 9009 1726776633.33450: stderr chunk (state=3): >>><<< 9009 1726776633.33465: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
9009 1726776633.33502: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '3feaf86b2638623e3300792e683ce55f91f31e9a', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9009 1726776633.33537: _low_level_execute_command(): starting 9009 1726776633.33545: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/ > /dev/null 2>&1 && sleep 0' 9009 1726776633.36635: stderr chunk (state=2): >>><<< 9009 1726776633.36646: stdout chunk (state=2): >>><<< 9009 1726776633.36665: _low_level_execute_command() done: rc=0, stdout=, stderr= 9009 1726776633.36676: handler run complete 9009 1726776633.36708: attempt loop complete, returning result 9009 1726776633.36714: _execute() done 9009 1726776633.36717: dumping result to json 9009 1726776633.36725: done dumping result, returning 9009 1726776633.36737: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-cec2-986e-0000000000bf] 9009 1726776633.36743: sending task result for task 120fa90a-8a95-cec2-986e-0000000000bf 9009 1726776633.36810: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000bf 
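The stat-then-copy sequence above is what makes the template deployment idempotent: the rendered `kernel_settings.j2` is uploaded to a temporary path, and the copy module only reports `"changed": true` when the destination's checksum differs from the source's. A minimal sketch of that decision logic (hypothetical, not the actual `ansible.builtin.copy` code, which also handles ownership, mode, SELinux context, and atomic writes):

```python
import hashlib
import os
import shutil


def sha1sum(path):
    """SHA-1 of a file's contents, read in chunks."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def copy_if_changed(src, dest):
    """Copy src over dest only when their checksums differ.

    Hypothetical sketch of the changed/unchanged decision the copy
    module makes; returns a result dict in the same spirit as the log.
    """
    if os.path.exists(dest) and sha1sum(dest) == sha1sum(src):
        return {"dest": dest, "changed": False}
    shutil.copyfile(src, dest)
    return {"dest": dest, "checksum": sha1sum(dest), "changed": True}
```

Running the same deployment twice would therefore report `changed` the first time and not the second, which is exactly the behavior the role relies on to decide whether tuned needs a restart.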
9009 1726776633.36815: WORKER PROCESS EXITING
changed: [managed_node2] => {
    "changed": true,
    "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a",
    "dest": "/etc/tuned/kernel_settings/tuned.conf",
    "gid": 0,
    "group": "root",
    "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_etc_t:s0",
    "size": 372,
    "src": "/root/.ansible/tmp/ansible-tmp-1726776632.6730611-9009-95546481590694/source",
    "state": "file",
    "uid": 0
}
8218 1726776633.37222: no more pending results, returning what we have
8218 1726776633.37225: results queue empty
8218 1726776633.37226: checking for any_errors_fatal
8218 1726776633.37234: done checking for any_errors_fatal
8218 1726776633.37234: checking for max_fail_percentage
8218 1726776633.37236: done checking for max_fail_percentage
8218 1726776633.37236: checking to see if all hosts have failed and the running result is not ok
8218 1726776633.37237: done checking to see if all hosts have failed
8218 1726776633.37238: getting the remaining hosts for this loop
8218 1726776633.37239: done getting the remaining hosts for this loop
8218 1726776633.37243: getting the next task for host managed_node2
8218 1726776633.37249: done getting next task for host managed_node2
8218 1726776633.37252: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes
8218 1726776633.37257: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
8218 1726776633.37270: getting variables
8218 1726776633.37271: in VariableManager get_vars()
8218 1726776633.37305: Calling all_inventory to load vars for managed_node2
8218 1726776633.37308: Calling groups_inventory to load vars for managed_node2
8218 1726776633.37310: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776633.37318: Calling all_plugins_play to load vars for managed_node2
8218 1726776633.37321: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776633.37323: Calling groups_plugins_play to load vars for managed_node2
8218 1726776633.37493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776633.37731: done with get_vars()
8218 1726776633.37742: done getting variables
8218 1726776633.37799: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
Thursday 19 September 2024  16:10:33 -0400 (0:00:00.748)       0:00:19.208 ****
8218 1726776633.37830: entering _queue_task() for managed_node2/service
8218 1726776633.38022: worker is 1 (out of 1 available)
8218 1726776633.38035: exiting _queue_task() for managed_node2/service
8218 1726776633.38045: done queuing things up, now waiting for results queue to drain
8218 1726776633.38047: waiting for pending results...
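Throughout the log, every remote step is a `_low_level_execute_command()` call: the controller wraps the command in `/bin/sh -c '... && sleep 0'` (the trailing `sleep 0` helps flush output over the SSH channel) and records the rc, stdout, and stderr chunks shown above. The pattern can be sketched locally with `subprocess`; this is a hypothetical simplification that runs on the local host rather than over SSH:

```python
import subprocess


def low_level_execute(cmd):
    """Run a shell command the way the log shows, returning (rc, stdout, stderr).

    Hypothetical sketch: Ansible executes this over an SSH connection
    plugin and streams the output in chunks; here we just run locally.
    """
    proc = subprocess.run(
        ["/bin/sh", "-c", f"{cmd} && sleep 0"],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr
```

A successful step logs `done: rc=0`, as with the `chmod u+x ...` and `mkdir -p ...` commands above; a nonzero rc would mark the task failed.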
9063 1726776633.38257: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9063 1726776633.38389: in run() - task 120fa90a-8a95-cec2-986e-0000000000c0 9063 1726776633.38405: variable 'ansible_search_path' from source: unknown 9063 1726776633.38409: variable 'ansible_search_path' from source: unknown 9063 1726776633.38450: variable '__kernel_settings_services' from source: include_vars 9063 1726776633.38744: variable '__kernel_settings_services' from source: include_vars 9063 1726776633.38812: variable 'omit' from source: magic vars 9063 1726776633.38921: variable 'ansible_host' from source: host vars for 'managed_node2' 9063 1726776633.38937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9063 1726776633.38947: variable 'omit' from source: magic vars 9063 1726776633.39236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9063 1726776633.39465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9063 1726776633.39515: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9063 1726776633.39551: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9063 1726776633.39584: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9063 1726776633.39680: variable '__kernel_settings_register_profile' from source: set_fact 9063 1726776633.39695: variable '__kernel_settings_register_mode' from source: set_fact 9063 1726776633.39713: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): True 9063 1726776633.39720: variable 'omit' from source: magic vars 9063 1726776633.39761: variable 'omit' from source: magic vars 9063 1726776633.39805: variable 'item' 
from source: unknown 9063 1726776633.39892: variable 'item' from source: unknown 9063 1726776633.39913: variable 'omit' from source: magic vars 9063 1726776633.39940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9063 1726776633.39967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9063 1726776633.39990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9063 1726776633.40007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9063 1726776633.40018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9063 1726776633.40047: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9063 1726776633.40054: variable 'ansible_host' from source: host vars for 'managed_node2' 9063 1726776633.40058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9063 1726776633.40156: Set connection var ansible_connection to ssh 9063 1726776633.40165: Set connection var ansible_pipelining to False 9063 1726776633.40172: Set connection var ansible_timeout to 10 9063 1726776633.40180: Set connection var ansible_module_compression to ZIP_DEFLATED 9063 1726776633.40189: Set connection var ansible_shell_type to sh 9063 1726776633.40195: Set connection var ansible_shell_executable to /bin/sh 9063 1726776633.40212: variable 'ansible_shell_executable' from source: unknown 9063 1726776633.40217: variable 'ansible_connection' from source: unknown 9063 1726776633.40220: variable 'ansible_module_compression' from source: unknown 9063 1726776633.40223: variable 'ansible_shell_type' from source: unknown 9063 1726776633.40226: variable 'ansible_shell_executable' from source: unknown 9063 1726776633.40299: variable 'ansible_host' from source: host vars 
for 'managed_node2' 9063 1726776633.40305: variable 'ansible_pipelining' from source: unknown 9063 1726776633.40309: variable 'ansible_timeout' from source: unknown 9063 1726776633.40313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9063 1726776633.40419: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9063 1726776633.40433: variable 'omit' from source: magic vars 9063 1726776633.40440: starting attempt loop 9063 1726776633.40443: running the handler 9063 1726776633.40519: variable 'ansible_facts' from source: unknown 9063 1726776633.40641: _low_level_execute_command(): starting 9063 1726776633.40650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9063 1726776633.43939: stdout chunk (state=2): >>>/root <<< 9063 1726776633.44014: stderr chunk (state=3): >>><<< 9063 1726776633.44022: stdout chunk (state=3): >>><<< 9063 1726776633.44047: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9063 1726776633.44069: _low_level_execute_command(): starting 9063 1726776633.44078: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198 `" && echo ansible-tmp-1726776633.4406009-9063-72088145277198="` echo /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198 `" ) && sleep 0' 9063 1726776633.48643: stdout chunk (state=2): >>>ansible-tmp-1726776633.4406009-9063-72088145277198=/root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198 <<< 9063 1726776633.48742: stderr chunk (state=3): >>><<< 9063 1726776633.48752: stdout chunk (state=3): >>><<< 9063 1726776633.48773: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726776633.4406009-9063-72088145277198=/root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198 , stderr= 9063 1726776633.48810: variable 'ansible_module_compression' from source: unknown 9063 1726776633.48870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 9063 1726776633.48931: variable 'ansible_facts' from source: unknown 9063 1726776633.49192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/AnsiballZ_systemd.py 9063 1726776633.50886: Sending initial data 9063 1726776633.50895: Sent initial data (153 bytes) 9063 1726776633.55506: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpru3csu5a /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/AnsiballZ_systemd.py <<< 9063 1726776633.58974: stderr chunk (state=3): >>><<< 9063 1726776633.58987: stdout chunk (state=3): >>><<< 9063 1726776633.59014: done transferring module to remote 9063 1726776633.59035: _low_level_execute_command(): starting 9063 1726776633.59042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/ /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/AnsiballZ_systemd.py && sleep 0' 9063 1726776633.63626: stderr chunk (state=2): >>><<< 9063 1726776633.63639: stdout chunk (state=2): >>><<< 9063 1726776633.63655: _low_level_execute_command() done: rc=0, stdout=, stderr= 9063 1726776633.63660: _low_level_execute_command(): starting 9063 1726776633.63666: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/AnsiballZ_systemd.py && sleep 0' 9063 1726776634.19898: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": 
"/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:24 EDT", "WatchdogTimestampMonotonic": "235500210", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8410", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ExecMainStartTimestampMonotonic": "235251649", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8410", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:24 EDT] ; stop_time=[n/a] ; pid=8410 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15007744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHi<<< 9063 1726776634.19923: stdout chunk 
(state=3): >>>gh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner 
cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", 
"UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:24 EDT", "StateChangeTimestampMonotonic": "235500213", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveExitTimestampMonotonic": "235251858", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveEnterTimestampMonotonic": "235500213", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveExitTimestampMonotonic": "235180357", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveEnterTimestampMonotonic": "235248719", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ConditionTimestampMonotonic": "235250552", "AssertTimestamp": "Thu 2024-09-19 16:10:24 EDT", "AssertTimestampMonotonic": "235250553", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "222bf69699fd489abd35224c3eab4032", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9063 1726776634.21659: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 9063 1726776634.21707: stderr chunk (state=3): >>><<< 9063 1726776634.21713: stdout chunk (state=3): >>><<< 9063 1726776634.21734: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:24 EDT", "WatchdogTimestampMonotonic": "235500210", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8410", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ExecMainStartTimestampMonotonic": "235251649", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8410", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:24 EDT] ; stop_time=[n/a] ; pid=8410 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15007744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", 
"StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": 
"cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "Documentation": "man:tuned(8) 
man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:24 EDT", "StateChangeTimestampMonotonic": "235500213", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveExitTimestampMonotonic": "235251858", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveEnterTimestampMonotonic": "235500213", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveExitTimestampMonotonic": "235180357", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveEnterTimestampMonotonic": "235248719", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ConditionTimestampMonotonic": "235250552", "AssertTimestamp": "Thu 2024-09-19 16:10:24 EDT", "AssertTimestampMonotonic": "235250553", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "222bf69699fd489abd35224c3eab4032", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
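The systemd module's JSON reply printed above can be checked programmatically once it is parsed. A minimal sketch, assuming a hand-trimmed subset of the real payload (the actual result carries the full set of unit properties shown in the log); `service_restarted_ok` is a hypothetical helper, not part of Ansible:

```python
import json

# Hand-trimmed, illustrative subset of the module result printed above;
# the real payload contains every systemd unit property.
raw = '''{"name": "tuned", "changed": true,
          "state": "started", "enabled": true,
          "status": {"ActiveState": "active", "SubState": "running",
                     "Result": "success", "NRestarts": "0"}}'''

result = json.loads(raw)

def service_restarted_ok(res):
    """Return True if a systemd-module result shows a healthy, running unit."""
    status = res.get("status", {})
    return (res.get("changed") is True
            and status.get("ActiveState") == "active"
            and status.get("Result") == "success")

print(service_restarted_ok(result))
```

The same keys (`changed`, `status.ActiveState`, `status.Result`) appear in the registered result inside a playbook, so an equivalent check can be expressed in a `when:` or `assert` task.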
9063 1726776634.21836: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9063 1726776634.21854: _low_level_execute_command(): starting 9063 1726776634.21861: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776633.4406009-9063-72088145277198/ > /dev/null 2>&1 && sleep 0' 9063 1726776634.24452: stderr chunk (state=2): >>><<< 9063 1726776634.24460: stdout chunk (state=2): >>><<< 9063 1726776634.24475: _low_level_execute_command() done: rc=0, stdout=, stderr= 9063 1726776634.24484: handler run complete 9063 1726776634.24516: attempt loop complete, returning result 9063 1726776634.24535: variable 'item' from source: unknown 9063 1726776634.24609: variable 'item' from source: unknown changed: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveEnterTimestampMonotonic": "235500213", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ActiveExitTimestampMonotonic": "235180357", "ActiveState": "active", "After": "polkit.service system.slice sysinit.target dbus.socket basic.target systemd-sysctl.service systemd-journald.socket dbus.service network.target", "AllowIsolate": "no", "AllowedCPUs": 
"", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:24 EDT", "AssertTimestampMonotonic": "235250553", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ConditionTimestampMonotonic": "235250552", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service cpupower.service power-profiles-daemon.service tlp.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", 
"EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8410", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:24 EDT", "ExecMainStartTimestampMonotonic": "235251649", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:24 EDT] ; stop_time=[n/a] ; pid=8410 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveEnterTimestampMonotonic": "235248719", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:24 EDT", "InactiveExitTimestampMonotonic": "235251858", "InvocationID": "222bf69699fd489abd35224c3eab4032", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8410", "MemoryAccounting": "yes", "MemoryCurrent": "15007744", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice sysinit.target dbus.service dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", 
"StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:24 EDT", "StateChangeTimestampMonotonic": "235500213", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:24 EDT", "WatchdogTimestampMonotonic": "235500210", "WatchdogUSec": "0" } } 9063 1726776634.24702: dumping result to json 9063 1726776634.24720: done dumping result, returning 9063 1726776634.24731: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-cec2-986e-0000000000c0] 9063 1726776634.24738: sending task result for task 120fa90a-8a95-cec2-986e-0000000000c0 9063 1726776634.24841: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000c0 9063 1726776634.24846: WORKER PROCESS EXITING 8218 1726776634.25144: no more pending results, returning what we have 8218 1726776634.25147: results queue empty 8218 1726776634.25147: checking for any_errors_fatal 8218 1726776634.25154: done checking for any_errors_fatal 8218 1726776634.25154: checking for max_fail_percentage 8218 1726776634.25155: done checking for max_fail_percentage 8218 1726776634.25156: checking to see if all hosts 
have failed and the running result is not ok 8218 1726776634.25156: done checking to see if all hosts have failed 8218 1726776634.25157: getting the remaining hosts for this loop 8218 1726776634.25157: done getting the remaining hosts for this loop 8218 1726776634.25159: getting the next task for host managed_node2 8218 1726776634.25163: done getting next task for host managed_node2 8218 1726776634.25165: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8218 1726776634.25167: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776634.25172: getting variables 8218 1726776634.25173: in VariableManager get_vars() 8218 1726776634.25197: Calling all_inventory to load vars for managed_node2 8218 1726776634.25200: Calling groups_inventory to load vars for managed_node2 8218 1726776634.25201: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776634.25208: Calling all_plugins_play to load vars for managed_node2 8218 1726776634.25209: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776634.25211: Calling groups_plugins_play to load vars for managed_node2 8218 1726776634.25312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776634.25448: done with get_vars() 8218 1726776634.25455: done getting variables 8218 1726776634.25517: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:10:34 -0400 (0:00:00.877) 0:00:20.085 **** 8218 1726776634.25542: entering _queue_task() for managed_node2/command 8218 1726776634.25543: Creating lock for command 8218 1726776634.25697: worker is 1 (out of 1 available) 8218 1726776634.25711: exiting _queue_task() for managed_node2/command 8218 1726776634.25722: done queuing things up, now waiting for results queue to drain 8218 1726776634.25724: waiting for pending results... 
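The `_queue_task()` / "waiting for pending results..." lines above trace a queue-and-drain pattern: the main strategy loop hands a task to a worker, then blocks on a results queue until the worker posts back. A minimal sketch of that pattern (Ansible forks worker *processes*; a thread is used here only to keep the sketch self-contained, and all names are illustrative, not Ansible's internals):

```python
import queue
import threading

results = queue.Queue()

def run_task(task):
    # Worker executes the task, then posts its outcome on the results queue.
    results.put({"task": task, "skipped": True})

worker = threading.Thread(target=run_task, args=("Tuned apply settings",))
worker.start()            # "worker is 1 (out of 1 available)"
outcome = results.get()   # "waiting for pending results..." (blocks until posted)
worker.join()             # "WORKER PROCESS EXITING"
```

The log's "done queuing things up, now waiting for results queue to drain" corresponds to the blocking `results.get()` step.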
9087 1726776634.25845: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9087 1726776634.25947: in run() - task 120fa90a-8a95-cec2-986e-0000000000c1 9087 1726776634.25963: variable 'ansible_search_path' from source: unknown 9087 1726776634.25967: variable 'ansible_search_path' from source: unknown 9087 1726776634.25996: calling self._execute() 9087 1726776634.26056: variable 'ansible_host' from source: host vars for 'managed_node2' 9087 1726776634.26065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9087 1726776634.26073: variable 'omit' from source: magic vars 9087 1726776634.26402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9087 1726776634.26578: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9087 1726776634.26613: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9087 1726776634.26640: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9087 1726776634.26666: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9087 1726776634.26751: variable '__kernel_settings_register_profile' from source: set_fact 9087 1726776634.26771: Evaluated conditional (not __kernel_settings_register_profile is changed): False 9087 1726776634.26776: when evaluation is False, skipping this task 9087 1726776634.26779: _execute() done 9087 1726776634.26785: dumping result to json 9087 1726776634.26790: done dumping result, returning 9087 1726776634.26795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-cec2-986e-0000000000c1] 9087 1726776634.26802: sending task result for task 120fa90a-8a95-cec2-986e-0000000000c1 9087 1726776634.26824: done sending task 
result for task 120fa90a-8a95-cec2-986e-0000000000c1 9087 1726776634.26827: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_register_profile is changed", "skip_reason": "Conditional result was False" } 8218 1726776634.26951: no more pending results, returning what we have 8218 1726776634.26954: results queue empty 8218 1726776634.26954: checking for any_errors_fatal 8218 1726776634.26968: done checking for any_errors_fatal 8218 1726776634.26969: checking for max_fail_percentage 8218 1726776634.26970: done checking for max_fail_percentage 8218 1726776634.26970: checking to see if all hosts have failed and the running result is not ok 8218 1726776634.26971: done checking to see if all hosts have failed 8218 1726776634.26972: getting the remaining hosts for this loop 8218 1726776634.26972: done getting the remaining hosts for this loop 8218 1726776634.26975: getting the next task for host managed_node2 8218 1726776634.26980: done getting next task for host managed_node2 8218 1726776634.26983: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8218 1726776634.26985: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776634.26996: getting variables 8218 1726776634.26997: in VariableManager get_vars() 8218 1726776634.27021: Calling all_inventory to load vars for managed_node2 8218 1726776634.27022: Calling groups_inventory to load vars for managed_node2 8218 1726776634.27024: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776634.27031: Calling all_plugins_play to load vars for managed_node2 8218 1726776634.27033: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776634.27035: Calling groups_plugins_play to load vars for managed_node2 8218 1726776634.27133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776634.27251: done with get_vars() 8218 1726776634.27257: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:10:34 -0400 (0:00:00.017) 0:00:20.103 **** 8218 1726776634.27317: entering _queue_task() for managed_node2/include_tasks 8218 1726776634.27462: worker is 1 (out of 1 available) 8218 1726776634.27474: exiting _queue_task() for managed_node2/include_tasks 8218 1726776634.27485: done queuing things up, now waiting for results queue to drain 8218 1726776634.27486: waiting for pending results... 
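The two `when:` evaluations recorded in this run reduce to inspecting the registered results' `changed` flag, which is what Ansible's `is changed` test does. A sketch mirroring the log's outcomes (the dict values below are inferred from the evaluations the log reports, not taken from the role's source):

```python
# Values mirroring what the log reports for this run:
register_profile = {"changed": True}   # profile update changed something
register_apply   = {"changed": True}   # tuned restart reported changed=true

def is_changed(result):
    """Minimal stand-in for Ansible's `is changed` test."""
    return bool(result.get("changed"))

# "Tuned apply settings": when: not __kernel_settings_register_profile is changed
run_tuned_apply = not is_changed(register_profile)  # evaluated False -> task skipped

# "Verify settings": when: __kernel_settings_register_apply is changed
run_verify = is_changed(register_apply)             # evaluated True -> tasks included
```

This is why "Tuned apply settings" is skipped with `"false_condition": "not __kernel_settings_register_profile is changed"` while "Verify settings" goes on to include `verify_settings.yml`.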
9088 1726776634.27598: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings
9088 1726776634.27701: in run() - task 120fa90a-8a95-cec2-986e-0000000000c2
9088 1726776634.27715: variable 'ansible_search_path' from source: unknown
9088 1726776634.27720: variable 'ansible_search_path' from source: unknown
9088 1726776634.27748: calling self._execute()
9088 1726776634.27805: variable 'ansible_host' from source: host vars for 'managed_node2'
9088 1726776634.27814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9088 1726776634.27822: variable 'omit' from source: magic vars
9088 1726776634.28127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9088 1726776634.28297: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9088 1726776634.28374: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9088 1726776634.28402: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9088 1726776634.28428: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9088 1726776634.28508: variable '__kernel_settings_register_apply' from source: set_fact
9088 1726776634.28531: Evaluated conditional (__kernel_settings_register_apply is changed): True
9088 1726776634.28539: _execute() done
9088 1726776634.28543: dumping result to json
9088 1726776634.28547: done dumping result, returning
9088 1726776634.28553: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-cec2-986e-0000000000c2]
9088 1726776634.28559: sending task result for task 120fa90a-8a95-cec2-986e-0000000000c2
9088 1726776634.28580: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000c2
9088 1726776634.28586: WORKER PROCESS EXITING
8218 1726776634.28680: no more pending results, returning what we have
8218 1726776634.28684: in VariableManager get_vars()
8218 1726776634.28716: Calling all_inventory to load vars for managed_node2
8218 1726776634.28719: Calling groups_inventory to load vars for managed_node2
8218 1726776634.28720: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776634.28730: Calling all_plugins_play to load vars for managed_node2
8218 1726776634.28732: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776634.28735: Calling groups_plugins_play to load vars for managed_node2
8218 1726776634.28839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776634.28983: done with get_vars()
8218 1726776634.28987: variable 'ansible_search_path' from source: unknown
8218 1726776634.28988: variable 'ansible_search_path' from source: unknown
8218 1726776634.29010: we have included files to process
8218 1726776634.29011: generating all_blocks data
8218 1726776634.29011: done generating all_blocks data
8218 1726776634.29016: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
8218 1726776634.29017: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
8218 1726776634.29018: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2
8218 1726776634.29314: done processing included file
8218 1726776634.29315: iterating over new_blocks loaded from include file
8218 1726776634.29316: in VariableManager get_vars()
8218 1726776634.29332: done with get_vars()
8218 1726776634.29333: filtering new block on tags
8218 1726776634.29363: done filtering new block on tags
8218 1726776634.29365: done iterating over new_blocks loaded from include file
8218 1726776634.29365: extending task lists for all hosts with included blocks
8218 1726776634.29719: done extending task lists
8218 1726776634.29720: done processing included files
8218 1726776634.29721: results queue empty
8218 1726776634.29721: checking for any_errors_fatal
8218 1726776634.29723: done checking for any_errors_fatal
8218 1726776634.29724: checking for max_fail_percentage
8218 1726776634.29724: done checking for max_fail_percentage
8218 1726776634.29725: checking to see if all hosts have failed and the running result is not ok
8218 1726776634.29725: done checking to see if all hosts have failed
8218 1726776634.29726: getting the remaining hosts for this loop
8218 1726776634.29726: done getting the remaining hosts for this loop
8218 1726776634.29728: getting the next task for host managed_node2
8218 1726776634.29732: done getting next task for host managed_node2
8218 1726776634.29734: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly
8218 1726776634.29735: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776634.29742: getting variables
8218 1726776634.29743: in VariableManager get_vars()
8218 1726776634.29751: Calling all_inventory to load vars for managed_node2
8218 1726776634.29752: Calling groups_inventory to load vars for managed_node2
8218 1726776634.29753: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776634.29756: Calling all_plugins_play to load vars for managed_node2
8218 1726776634.29757: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776634.29759: Calling groups_plugins_play to load vars for managed_node2
8218 1726776634.29980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776634.30092: done with get_vars()
8218 1726776634.30099: done getting variables
8218 1726776634.30122: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
Thursday 19 September 2024 16:10:34 -0400 (0:00:00.028) 0:00:20.132 ****
8218 1726776634.30144: entering _queue_task() for managed_node2/command
8218 1726776634.30291: worker is 1 (out of 1 available)
8218 1726776634.30303: exiting _queue_task() for managed_node2/command
8218 1726776634.30315: done queuing things up, now waiting for results queue to drain
8218 1726776634.30317: waiting for pending results...
9089 1726776634.30437: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly
9089 1726776634.30551: in run() - task 120fa90a-8a95-cec2-986e-0000000001c9
9089 1726776634.30567: variable 'ansible_search_path' from source: unknown
9089 1726776634.30571: variable 'ansible_search_path' from source: unknown
9089 1726776634.30599: calling self._execute()
9089 1726776634.30658: variable 'ansible_host' from source: host vars for 'managed_node2'
9089 1726776634.30667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9089 1726776634.30676: variable 'omit' from source: magic vars
9089 1726776634.30745: variable 'omit' from source: magic vars
9089 1726776634.30792: variable 'omit' from source: magic vars
9089 1726776634.30815: variable 'omit' from source: magic vars
9089 1726776634.30847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
9089 1726776634.30874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
9089 1726776634.30895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
9089 1726776634.30909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9089 1726776634.30920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9089 1726776634.30945: variable 'inventory_hostname' from source: host vars for 'managed_node2'
9089 1726776634.30950: variable 'ansible_host' from source: host vars for 'managed_node2'
9089 1726776634.30955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9089 1726776634.31024: Set connection var ansible_connection to ssh
9089 1726776634.31033: Set connection var ansible_pipelining to False
9089 1726776634.31040: Set connection var ansible_timeout to 10
9089 1726776634.31048: Set connection var ansible_module_compression to ZIP_DEFLATED
9089 1726776634.31053: Set connection var ansible_shell_type to sh
9089 1726776634.31058: Set connection var ansible_shell_executable to /bin/sh
9089 1726776634.31073: variable 'ansible_shell_executable' from source: unknown
9089 1726776634.31078: variable 'ansible_connection' from source: unknown
9089 1726776634.31081: variable 'ansible_module_compression' from source: unknown
9089 1726776634.31087: variable 'ansible_shell_type' from source: unknown
9089 1726776634.31090: variable 'ansible_shell_executable' from source: unknown
9089 1726776634.31094: variable 'ansible_host' from source: host vars for 'managed_node2'
9089 1726776634.31097: variable 'ansible_pipelining' from source: unknown
9089 1726776634.31099: variable 'ansible_timeout' from source: unknown
9089 1726776634.31101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9089 1726776634.31185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
9089 1726776634.31196: variable 'omit' from source: magic vars
9089 1726776634.31200: starting attempt loop
9089 1726776634.31203: running the handler
9089 1726776634.31214: _low_level_execute_command(): starting
9089 1726776634.31221: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
9089 1726776634.33579: stdout chunk (state=2): >>>/root <<<
9089 1726776634.33697: stderr chunk (state=3): >>><<<
9089 1726776634.33703: stdout chunk (state=3): >>><<<
9089 1726776634.33719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=
9089 1726776634.33733: _low_level_execute_command(): starting
9089 1726776634.33740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680 `" && echo ansible-tmp-1726776634.3372629-9089-63976490156680="` echo /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680 `" ) && sleep 0'
9089 1726776634.36213: stdout chunk (state=2): >>>ansible-tmp-1726776634.3372629-9089-63976490156680=/root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680 <<<
9089 1726776634.36341: stderr chunk (state=3): >>><<<
9089 1726776634.36349: stdout chunk (state=3): >>><<<
9089 1726776634.36362: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776634.3372629-9089-63976490156680=/root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680 , stderr=
9089 1726776634.36387: variable 'ansible_module_compression' from source: unknown
9089 1726776634.36432: ANSIBALLZ: Using generic lock for ansible.legacy.command
9089 1726776634.36437: ANSIBALLZ: Acquiring lock
9089 1726776634.36440: ANSIBALLZ: Lock acquired: 140571206407024
9089 1726776634.36445: ANSIBALLZ: Creating module
9089 1726776634.45471: ANSIBALLZ: Writing module into payload
9089 1726776634.45552: ANSIBALLZ: Writing module
9089 1726776634.45571: ANSIBALLZ: Renaming module
9089 1726776634.45578: ANSIBALLZ: Done creating module
9089 1726776634.45593: variable 'ansible_facts' from source: unknown
9089 1726776634.45650: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/AnsiballZ_command.py
9089 1726776634.45747: Sending initial data
9089 1726776634.45753: Sent initial data (153 bytes)
9089 1726776634.48439: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpoud_0hmc /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/AnsiballZ_command.py <<<
9089 1726776634.49510: stderr chunk (state=3): >>><<<
9089 1726776634.49517: stdout chunk (state=3): >>><<<
9089 1726776634.49539: done transferring module to remote
9089 1726776634.49550: _low_level_execute_command(): starting
9089 1726776634.49556: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/ /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/AnsiballZ_command.py && sleep 0'
9089 1726776634.51970: stderr chunk (state=2): >>><<<
9089 1726776634.51977: stdout chunk (state=2): >>><<<
9089 1726776634.51992: _low_level_execute_command() done: rc=0, stdout=, stderr=
9089 1726776634.51996: _low_level_execute_command(): starting
9089 1726776634.52001: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/AnsiballZ_command.py && sleep 0'
9089 1726776634.82941: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:10:34.677146", "end": "2024-09-19 16:10:34.828502", "delta": "0:00:00.151356", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
9089 1726776634.84514: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<<
9089 1726776634.84525: stdout chunk (state=3): >>><<<
9089 1726776634.84540: stderr chunk (state=3): >>><<<
9089 1726776634.84556: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:10:34.677146", "end": "2024-09-19 16:10:34.828502", "delta": "0:00:00.151356", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed.
9089 1726776634.84607: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/', '_ansible_remote_tmp': '~/.ansible/tmp'})
9089 1726776634.84618: _low_level_execute_command(): starting
9089 1726776634.84623: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776634.3372629-9089-63976490156680/ > /dev/null 2>&1 && sleep 0'
9089 1726776634.87989: stderr chunk (state=2): >>><<<
9089 1726776634.87997: stdout chunk (state=2): >>><<<
9089 1726776634.88013: _low_level_execute_command() done: rc=0, stdout=, stderr=
9089 1726776634.88022: handler run complete
9089 1726776634.88047: Evaluated conditional (False): False
9089 1726776634.88059: attempt loop complete, returning result
9089 1726776634.88064: _execute() done
9089 1726776634.88067: dumping result to json
9089 1726776634.88072: done dumping result, returning
9089 1726776634.88080: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-cec2-986e-0000000001c9]
9089 1726776634.88089: sending task result for task 120fa90a-8a95-cec2-986e-0000000001c9
9089 1726776634.88131: done sending task result for task 120fa90a-8a95-cec2-986e-0000000001c9
9089 1726776634.88136: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": [
        "tuned-adm",
        "verify",
        "-i"
    ],
    "delta": "0:00:00.151356",
    "end": "2024-09-19 16:10:34.828502",
    "rc": 0,
    "start": "2024-09-19 16:10:34.677146"
}

STDOUT:

Verification succeeded, current system settings match the preset profile.
See TuneD log file ('/var/log/tuned/tuned.log') for details.
8218 1726776634.88622: no more pending results, returning what we have
8218 1726776634.88625: results queue empty
8218 1726776634.88626: checking for any_errors_fatal
8218 1726776634.88630: done checking for any_errors_fatal
8218 1726776634.88631: checking for max_fail_percentage
8218 1726776634.88632: done checking for max_fail_percentage
8218 1726776634.88633: checking to see if all hosts have failed and the running result is not ok
8218 1726776634.88633: done checking to see if all hosts have failed
8218 1726776634.88634: getting the remaining hosts for this loop
8218 1726776634.88635: done getting the remaining hosts for this loop
8218 1726776634.88638: getting the next task for host managed_node2
8218 1726776634.88645: done getting next task for host managed_node2
8218 1726776634.88649: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log
8218 1726776634.88653: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776634.88663: getting variables
8218 1726776634.88665: in VariableManager get_vars()
8218 1726776634.88700: Calling all_inventory to load vars for managed_node2
8218 1726776634.88702: Calling groups_inventory to load vars for managed_node2
8218 1726776634.88705: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776634.88713: Calling all_plugins_play to load vars for managed_node2
8218 1726776634.88716: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776634.88719: Calling groups_plugins_play to load vars for managed_node2
8218 1726776634.88836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776634.88960: done with get_vars()
8218 1726776634.88968: done getting variables
8218 1726776634.89035: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12
Thursday 19 September 2024 16:10:34 -0400 (0:00:00.589) 0:00:20.721 ****
8218 1726776634.89060: entering _queue_task() for managed_node2/shell
8218 1726776634.89061: Creating lock for shell
8218 1726776634.89223: worker is 1 (out of 1 available)
8218 1726776634.89238: exiting _queue_task() for managed_node2/shell
8218 1726776634.89249: done queuing things up, now waiting for results queue to drain
8218 1726776634.89251: waiting for pending results...
9100 1726776634.89399: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log
9100 1726776634.89551: in run() - task 120fa90a-8a95-cec2-986e-0000000001ca
9100 1726776634.89568: variable 'ansible_search_path' from source: unknown
9100 1726776634.89572: variable 'ansible_search_path' from source: unknown
9100 1726776634.89603: calling self._execute()
9100 1726776634.89678: variable 'ansible_host' from source: host vars for 'managed_node2'
9100 1726776634.89688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9100 1726776634.89697: variable 'omit' from source: magic vars
9100 1726776634.90093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9100 1726776634.90263: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9100 1726776634.90296: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9100 1726776634.90318: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9100 1726776634.90356: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9100 1726776634.90460: variable '__kernel_settings_register_verify_values' from source: set_fact
9100 1726776634.90483: Evaluated conditional (__kernel_settings_register_verify_values is failed): False
9100 1726776634.90489: when evaluation is False, skipping this task
9100 1726776634.90493: _execute() done
9100 1726776634.90496: dumping result to json
9100 1726776634.90500: done dumping result, returning
9100 1726776634.90505: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-cec2-986e-0000000001ca]
9100 1726776634.90512: sending task result for task 120fa90a-8a95-cec2-986e-0000000001ca
9100 1726776634.90534: done sending task result for task 120fa90a-8a95-cec2-986e-0000000001ca
9100 1726776634.90538: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_verify_values is failed",
    "skip_reason": "Conditional result was False"
}
8218 1726776634.90652: no more pending results, returning what we have
8218 1726776634.90655: results queue empty
8218 1726776634.90656: checking for any_errors_fatal
8218 1726776634.90661: done checking for any_errors_fatal
8218 1726776634.90661: checking for max_fail_percentage
8218 1726776634.90663: done checking for max_fail_percentage
8218 1726776634.90663: checking to see if all hosts have failed and the running result is not ok
8218 1726776634.90664: done checking to see if all hosts have failed
8218 1726776634.90665: getting the remaining hosts for this loop
8218 1726776634.90666: done getting the remaining hosts for this loop
8218 1726776634.90668: getting the next task for host managed_node2
8218 1726776634.90673: done getting next task for host managed_node2
8218 1726776634.90676: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors
8218 1726776634.90680: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776634.90692: getting variables
8218 1726776634.90693: in VariableManager get_vars()
8218 1726776634.90721: Calling all_inventory to load vars for managed_node2
8218 1726776634.90724: Calling groups_inventory to load vars for managed_node2
8218 1726776634.90726: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776634.90735: Calling all_plugins_play to load vars for managed_node2
8218 1726776634.90738: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776634.90741: Calling groups_plugins_play to load vars for managed_node2
8218 1726776634.90957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776634.91163: done with get_vars()
8218 1726776634.91172: done getting variables
8218 1726776634.91230: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23
Thursday 19 September 2024 16:10:34 -0400 (0:00:00.021) 0:00:20.743 ****
8218 1726776634.91261: entering _queue_task() for managed_node2/fail
8218 1726776634.91435: worker is 1 (out of 1 available)
8218 1726776634.91447: exiting _queue_task() for managed_node2/fail
8218 1726776634.91458: done queuing things up, now waiting for results queue to drain
8218 1726776634.91460: waiting for pending results...
9108 1726776634.91664: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors
9108 1726776634.91808: in run() - task 120fa90a-8a95-cec2-986e-0000000001cb
9108 1726776634.91823: variable 'ansible_search_path' from source: unknown
9108 1726776634.91827: variable 'ansible_search_path' from source: unknown
9108 1726776634.91861: calling self._execute()
9108 1726776634.91946: variable 'ansible_host' from source: host vars for 'managed_node2'
9108 1726776634.91957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9108 1726776634.91965: variable 'omit' from source: magic vars
9108 1726776634.92392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9108 1726776634.92675: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9108 1726776634.92720: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9108 1726776634.92753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9108 1726776634.92790: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9108 1726776634.92894: variable '__kernel_settings_register_verify_values' from source: set_fact
9108 1726776634.92919: Evaluated conditional (__kernel_settings_register_verify_values is failed): False
9108 1726776634.92924: when evaluation is False, skipping this task
9108 1726776634.92930: _execute() done
9108 1726776634.92934: dumping result to json
9108 1726776634.92938: done dumping result, returning
9108 1726776634.92944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-cec2-986e-0000000001cb]
9108 1726776634.92951: sending task result for task 120fa90a-8a95-cec2-986e-0000000001cb
9108 1726776634.92981: done sending task result for task 120fa90a-8a95-cec2-986e-0000000001cb
9108 1726776634.92987: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_verify_values is failed",
    "skip_reason": "Conditional result was False"
}
8218 1726776634.93307: no more pending results, returning what we have
8218 1726776634.93311: results queue empty
8218 1726776634.93312: checking for any_errors_fatal
8218 1726776634.93316: done checking for any_errors_fatal
8218 1726776634.93317: checking for max_fail_percentage
8218 1726776634.93318: done checking for max_fail_percentage
8218 1726776634.93319: checking to see if all hosts have failed and the running result is not ok
8218 1726776634.93319: done checking to see if all hosts have failed
8218 1726776634.93320: getting the remaining hosts for this loop
8218 1726776634.93321: done getting the remaining hosts for this loop
8218 1726776634.93324: getting the next task for host managed_node2
8218 1726776634.93333: done getting next task for host managed_node2
8218 1726776634.93336: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
8218 1726776634.93339: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776634.93352: getting variables
8218 1726776634.93354: in VariableManager get_vars()
8218 1726776634.93381: Calling all_inventory to load vars for managed_node2
8218 1726776634.93387: Calling groups_inventory to load vars for managed_node2
8218 1726776634.93389: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776634.93397: Calling all_plugins_play to load vars for managed_node2
8218 1726776634.93399: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776634.93402: Calling groups_plugins_play to load vars for managed_node2
8218 1726776634.93564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776634.93774: done with get_vars()
8218 1726776634.93786: done getting variables
8218 1726776634.93840: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177
Thursday 19 September 2024 16:10:34 -0400 (0:00:00.026) 0:00:20.769 ****
8218 1726776634.93869: entering _queue_task() for managed_node2/set_fact
8218 1726776634.94048: worker is 1 (out of 1 available)
8218 1726776634.94061: exiting _queue_task() for managed_node2/set_fact
8218 1726776634.94072: done queuing things up, now waiting for results queue to drain
8218 1726776634.94073: waiting for pending results...
9109 1726776634.94274: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
9109 1726776634.94394: in run() - task 120fa90a-8a95-cec2-986e-0000000000c3
9109 1726776634.94411: variable 'ansible_search_path' from source: unknown
9109 1726776634.94415: variable 'ansible_search_path' from source: unknown
9109 1726776634.94445: calling self._execute()
9109 1726776634.94519: variable 'ansible_host' from source: host vars for 'managed_node2'
9109 1726776634.94527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9109 1726776634.94537: variable 'omit' from source: magic vars
9109 1726776634.94624: variable 'omit' from source: magic vars
9109 1726776634.94668: variable 'omit' from source: magic vars
9109 1726776634.94699: variable 'omit' from source: magic vars
9109 1726776634.94736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
9109 1726776634.94767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
9109 1726776634.94789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
9109 1726776634.94806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9109 1726776634.94819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9109 1726776634.94848: variable 'inventory_hostname' from source: host vars for 'managed_node2'
9109 1726776634.94854: variable 'ansible_host' from source: host vars for 'managed_node2'
9109 1726776634.94859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9109 1726776634.94958: Set connection var ansible_connection to ssh
9109 1726776634.94966: Set connection var ansible_pipelining to False
9109 1726776634.94973: Set connection var ansible_timeout to 10
9109 1726776634.94980: Set connection var ansible_module_compression to ZIP_DEFLATED
9109 1726776634.94990: Set connection var ansible_shell_type to sh
9109 1726776634.94996: Set connection var ansible_shell_executable to /bin/sh
9109 1726776634.95014: variable 'ansible_shell_executable' from source: unknown
9109 1726776634.95019: variable 'ansible_connection' from source: unknown
9109 1726776634.95023: variable 'ansible_module_compression' from source: unknown
9109 1726776634.95026: variable 'ansible_shell_type' from source: unknown
9109 1726776634.95031: variable 'ansible_shell_executable' from source: unknown
9109 1726776634.95034: variable 'ansible_host' from source: host vars for 'managed_node2'
9109 1726776634.95037: variable 'ansible_pipelining' from source: unknown
9109 1726776634.95040: variable 'ansible_timeout' from source: unknown
9109 1726776634.95043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
9109 1726776634.95241: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
9109 1726776634.95254: variable 'omit' from source: magic vars
9109 1726776634.95260: starting attempt loop
9109 1726776634.95264: running the handler
9109 1726776634.95273: handler run complete
9109 1726776634.95282: attempt loop complete, returning result
9109 1726776634.95289: _execute() done
9109 1726776634.95291: dumping result to json
9109 1726776634.95294: done dumping result, returning
9109 1726776634.95300: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-0000000000c3]
9109 1726776634.95306: sending task result for
task 120fa90a-8a95-cec2-986e-0000000000c3 9109 1726776634.95331: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000c3 9109 1726776634.95335: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8218 1726776634.95642: no more pending results, returning what we have 8218 1726776634.95645: results queue empty 8218 1726776634.95646: checking for any_errors_fatal 8218 1726776634.95651: done checking for any_errors_fatal 8218 1726776634.95652: checking for max_fail_percentage 8218 1726776634.95654: done checking for max_fail_percentage 8218 1726776634.95654: checking to see if all hosts have failed and the running result is not ok 8218 1726776634.95655: done checking to see if all hosts have failed 8218 1726776634.95656: getting the remaining hosts for this loop 8218 1726776634.95657: done getting the remaining hosts for this loop 8218 1726776634.95660: getting the next task for host managed_node2 8218 1726776634.95665: done getting next task for host managed_node2 8218 1726776634.95668: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8218 1726776634.95671: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776634.95680: getting variables 8218 1726776634.95681: in VariableManager get_vars() 8218 1726776634.95710: Calling all_inventory to load vars for managed_node2 8218 1726776634.95713: Calling groups_inventory to load vars for managed_node2 8218 1726776634.95715: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776634.95723: Calling all_plugins_play to load vars for managed_node2 8218 1726776634.95725: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776634.95728: Calling groups_plugins_play to load vars for managed_node2 8218 1726776634.95939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776634.96143: done with get_vars() 8218 1726776634.96153: done getting variables 8218 1726776634.96205: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 16:10:34 -0400 (0:00:00.023) 0:00:20.792 **** 8218 1726776634.96234: entering _queue_task() for managed_node2/set_fact 8218 1726776634.96404: worker is 1 (out of 1 available) 8218 1726776634.96416: exiting _queue_task() for managed_node2/set_fact 8218 1726776634.96427: done queuing things up, now waiting for results queue to drain 8218 1726776634.96430: waiting for pending results... 
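The variable trace for this task shows it reading three registered results (__kernel_settings_register_profile, __kernel_settings_register_mode, __kernel_settings_register_apply) and loading Jinja2 test plugins before producing __kernel_settings_changed: true. A plausible sketch, assuming the flag is derived from the changed status of those registers — the exact expression is an assumption, not the role's verbatim source:

```yaml
# Hypothetical sketch of roles/kernel_settings/tasks/main.yml:181.
# The three register variables are named in the log's variable trace;
# the 'is changed' combination is a guess consistent with the
# TestModule loads the log shows.
- name: Set flag to indicate changed for testing
  set_fact:
    __kernel_settings_changed: "{{ __kernel_settings_register_profile is changed
      or __kernel_settings_register_mode is changed
      or __kernel_settings_register_apply is changed }}"
```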
9110 1726776634.96634: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9110 1726776634.96758: in run() - task 120fa90a-8a95-cec2-986e-0000000000c4 9110 1726776634.96775: variable 'ansible_search_path' from source: unknown 9110 1726776634.96779: variable 'ansible_search_path' from source: unknown 9110 1726776634.96812: calling self._execute() 9110 1726776634.96889: variable 'ansible_host' from source: host vars for 'managed_node2' 9110 1726776634.96899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9110 1726776634.96908: variable 'omit' from source: magic vars 9110 1726776634.97005: variable 'omit' from source: magic vars 9110 1726776634.97050: variable 'omit' from source: magic vars 9110 1726776634.97405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9110 1726776634.97679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9110 1726776634.97726: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9110 1726776634.97760: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9110 1726776634.97796: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9110 1726776634.97930: variable '__kernel_settings_register_profile' from source: set_fact 9110 1726776634.97944: variable '__kernel_settings_register_mode' from source: set_fact 9110 1726776634.97951: variable '__kernel_settings_register_apply' from source: set_fact 9110 1726776634.97998: variable 'omit' from source: magic vars 9110 1726776634.98024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9110 1726776634.98076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 9110 1726776634.98098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9110 1726776634.98114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9110 1726776634.98125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9110 1726776634.98154: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9110 1726776634.98161: variable 'ansible_host' from source: host vars for 'managed_node2' 9110 1726776634.98165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9110 1726776634.98262: Set connection var ansible_connection to ssh 9110 1726776634.98272: Set connection var ansible_pipelining to False 9110 1726776634.98279: Set connection var ansible_timeout to 10 9110 1726776634.98290: Set connection var ansible_module_compression to ZIP_DEFLATED 9110 1726776634.98297: Set connection var ansible_shell_type to sh 9110 1726776634.98303: Set connection var ansible_shell_executable to /bin/sh 9110 1726776634.98322: variable 'ansible_shell_executable' from source: unknown 9110 1726776634.98327: variable 'ansible_connection' from source: unknown 9110 1726776634.98332: variable 'ansible_module_compression' from source: unknown 9110 1726776634.98336: variable 'ansible_shell_type' from source: unknown 9110 1726776634.98339: variable 'ansible_shell_executable' from source: unknown 9110 1726776634.98342: variable 'ansible_host' from source: host vars for 'managed_node2' 9110 1726776634.98346: variable 'ansible_pipelining' from source: unknown 9110 1726776634.98349: variable 'ansible_timeout' from source: unknown 9110 1726776634.98353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9110 1726776634.98451: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9110 1726776634.98464: variable 'omit' from source: magic vars 9110 1726776634.98470: starting attempt loop 9110 1726776634.98473: running the handler 9110 1726776634.98487: handler run complete 9110 1726776634.98496: attempt loop complete, returning result 9110 1726776634.98500: _execute() done 9110 1726776634.98502: dumping result to json 9110 1726776634.98506: done dumping result, returning 9110 1726776634.98512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-cec2-986e-0000000000c4] 9110 1726776634.98519: sending task result for task 120fa90a-8a95-cec2-986e-0000000000c4 9110 1726776634.98546: done sending task result for task 120fa90a-8a95-cec2-986e-0000000000c4 9110 1726776634.98550: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8218 1726776634.98810: no more pending results, returning what we have 8218 1726776634.98813: results queue empty 8218 1726776634.98814: checking for any_errors_fatal 8218 1726776634.98818: done checking for any_errors_fatal 8218 1726776634.98819: checking for max_fail_percentage 8218 1726776634.98821: done checking for max_fail_percentage 8218 1726776634.98822: checking to see if all hosts have failed and the running result is not ok 8218 1726776634.98823: done checking to see if all hosts have failed 8218 1726776634.98823: getting the remaining hosts for this loop 8218 1726776634.98824: done getting the remaining hosts for this loop 8218 1726776634.98827: getting the next task for host managed_node2 8218 1726776634.98837: done getting next task for host managed_node2 8218 1726776634.98839: 
^ task is: TASK: meta (role_complete) 8218 1726776634.98841: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776634.98850: getting variables 8218 1726776634.98851: in VariableManager get_vars() 8218 1726776634.98880: Calling all_inventory to load vars for managed_node2 8218 1726776634.98883: Calling groups_inventory to load vars for managed_node2 8218 1726776634.98887: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776634.98896: Calling all_plugins_play to load vars for managed_node2 8218 1726776634.98898: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776634.98901: Calling groups_plugins_play to load vars for managed_node2 8218 1726776634.99065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776634.99273: done with get_vars() 8218 1726776634.99283: done getting variables 8218 1726776634.99355: done queuing things up, now waiting for results queue to drain 8218 1726776634.99357: results queue empty 8218 1726776634.99358: checking for any_errors_fatal 8218 1726776634.99361: done checking for any_errors_fatal 8218 1726776634.99362: checking for max_fail_percentage 8218 1726776634.99362: done checking for max_fail_percentage 8218 1726776634.99367: checking to see if all hosts have failed and the running result is not ok 8218 1726776634.99368: done checking to see if all hosts have failed 8218 1726776634.99368: getting the 
remaining hosts for this loop 8218 1726776634.99369: done getting the remaining hosts for this loop 8218 1726776634.99371: getting the next task for host managed_node2 8218 1726776634.99374: done getting next task for host managed_node2 8218 1726776634.99376: ^ task is: TASK: Ensure kernel_settings_reboot_required is unset or undefined 8218 1726776634.99377: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776634.99379: getting variables 8218 1726776634.99379: in VariableManager get_vars() 8218 1726776634.99392: Calling all_inventory to load vars for managed_node2 8218 1726776634.99395: Calling groups_inventory to load vars for managed_node2 8218 1726776634.99396: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776634.99400: Calling all_plugins_play to load vars for managed_node2 8218 1726776634.99403: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776634.99405: Calling groups_plugins_play to load vars for managed_node2 8218 1726776634.99572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776634.99758: done with get_vars() 8218 1726776634.99765: done getting variables 8218 1726776634.99839: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure kernel_settings_reboot_required is unset or undefined] ************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:71 
Thursday 19 September 2024 16:10:34 -0400 (0:00:00.036) 0:00:20.829 **** 8218 1726776634.99862: entering _queue_task() for managed_node2/assert 8218 1726776634.99863: Creating lock for assert 8218 1726776635.00169: worker is 1 (out of 1 available) 8218 1726776635.00182: exiting _queue_task() for managed_node2/assert 8218 1726776635.00196: done queuing things up, now waiting for results queue to drain 8218 1726776635.00198: waiting for pending results... 9111 1726776635.00400: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is unset or undefined 9111 1726776635.00498: in run() - task 120fa90a-8a95-cec2-986e-000000000010 9111 1726776635.00514: variable 'ansible_search_path' from source: unknown 9111 1726776635.00546: calling self._execute() 9111 1726776635.00619: variable 'ansible_host' from source: host vars for 'managed_node2' 9111 1726776635.00627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9111 1726776635.00636: variable 'omit' from source: magic vars 9111 1726776635.00728: variable 'omit' from source: magic vars 9111 1726776635.00758: variable 'omit' from source: magic vars 9111 1726776635.00789: variable 'omit' from source: magic vars 9111 1726776635.00824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9111 1726776635.00855: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9111 1726776635.00877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9111 1726776635.00897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9111 1726776635.00909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9111 1726776635.00937: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 9111 1726776635.00944: variable 'ansible_host' from source: host vars for 'managed_node2' 9111 1726776635.00947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9111 1726776635.01090: Set connection var ansible_connection to ssh 9111 1726776635.01100: Set connection var ansible_pipelining to False 9111 1726776635.01106: Set connection var ansible_timeout to 10 9111 1726776635.01113: Set connection var ansible_module_compression to ZIP_DEFLATED 9111 1726776635.01118: Set connection var ansible_shell_type to sh 9111 1726776635.01123: Set connection var ansible_shell_executable to /bin/sh 9111 1726776635.01145: variable 'ansible_shell_executable' from source: unknown 9111 1726776635.01150: variable 'ansible_connection' from source: unknown 9111 1726776635.01152: variable 'ansible_module_compression' from source: unknown 9111 1726776635.01155: variable 'ansible_shell_type' from source: unknown 9111 1726776635.01157: variable 'ansible_shell_executable' from source: unknown 9111 1726776635.01160: variable 'ansible_host' from source: host vars for 'managed_node2' 9111 1726776635.01164: variable 'ansible_pipelining' from source: unknown 9111 1726776635.01166: variable 'ansible_timeout' from source: unknown 9111 1726776635.01170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9111 1726776635.01290: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9111 1726776635.01303: variable 'omit' from source: magic vars 9111 1726776635.01309: starting attempt loop 9111 1726776635.01312: running the handler 9111 1726776635.01630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9111 1726776635.03941: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9111 1726776635.04015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9111 1726776635.04054: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9111 1726776635.04091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9111 1726776635.04117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9111 1726776635.04180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9111 1726776635.04209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9111 1726776635.04235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9111 1726776635.04271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9111 1726776635.04286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9111 1726776635.04412: variable 'kernel_settings_reboot_required' from source: set_fact 9111 1726776635.04431: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 9111 1726776635.04438: handler run complete 9111 
1726776635.04458: attempt loop complete, returning result 9111 1726776635.04462: _execute() done 9111 1726776635.04465: dumping result to json 9111 1726776635.04468: done dumping result, returning 9111 1726776635.04474: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is unset or undefined [120fa90a-8a95-cec2-986e-000000000010] 9111 1726776635.04479: sending task result for task 120fa90a-8a95-cec2-986e-000000000010 9111 1726776635.04507: done sending task result for task 120fa90a-8a95-cec2-986e-000000000010 9111 1726776635.04510: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776635.04845: no more pending results, returning what we have 8218 1726776635.04848: results queue empty 8218 1726776635.04848: checking for any_errors_fatal 8218 1726776635.04850: done checking for any_errors_fatal 8218 1726776635.04851: checking for max_fail_percentage 8218 1726776635.04852: done checking for max_fail_percentage 8218 1726776635.04853: checking to see if all hosts have failed and the running result is not ok 8218 1726776635.04854: done checking to see if all hosts have failed 8218 1726776635.04854: getting the remaining hosts for this loop 8218 1726776635.04855: done getting the remaining hosts for this loop 8218 1726776635.04858: getting the next task for host managed_node2 8218 1726776635.04863: done getting next task for host managed_node2 8218 1726776635.04864: ^ task is: TASK: Ensure role reported changed 8218 1726776635.04866: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776635.04869: getting variables 8218 1726776635.04870: in VariableManager get_vars() 8218 1726776635.04901: Calling all_inventory to load vars for managed_node2 8218 1726776635.04904: Calling groups_inventory to load vars for managed_node2 8218 1726776635.04906: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776635.04913: Calling all_plugins_play to load vars for managed_node2 8218 1726776635.04921: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776635.04923: Calling groups_plugins_play to load vars for managed_node2 8218 1726776635.05091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776635.05276: done with get_vars() 8218 1726776635.05289: done getting variables 8218 1726776635.05343: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:75 Thursday 19 September 2024 16:10:35 -0400 (0:00:00.055) 0:00:20.884 **** 8218 1726776635.05367: entering _queue_task() for managed_node2/assert 8218 1726776635.05544: worker is 1 (out of 1 available) 8218 1726776635.05556: exiting _queue_task() for managed_node2/assert 8218 1726776635.05566: done queuing things up, now waiting for results queue to drain 8218 1726776635.05568: waiting for pending results... 
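The two assertions in this test file are straightforward to reconstruct, because the log prints the evaluated conditionals verbatim: "Evaluated conditional (not kernel_settings_reboot_required | d(false)): True" and "Evaluated conditional (__kernel_settings_changed | d(false)): True". A minimal sketch of tests_change_settings.yml:71 and :75 — the conditionals are taken from the log; the surrounding task bodies are assumptions:

```yaml
# Sketch reconstructed from the evaluated conditionals in this log.
- name: Ensure kernel_settings_reboot_required is unset or undefined
  assert:
    that: not kernel_settings_reboot_required | d(false)

- name: Ensure role reported changed
  assert:
    that: __kernel_settings_changed | d(false)
```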
9112 1726776635.05773: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 9112 1726776635.05879: in run() - task 120fa90a-8a95-cec2-986e-000000000011 9112 1726776635.05900: variable 'ansible_search_path' from source: unknown 9112 1726776635.05933: calling self._execute() 9112 1726776635.06013: variable 'ansible_host' from source: host vars for 'managed_node2' 9112 1726776635.06023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9112 1726776635.06034: variable 'omit' from source: magic vars 9112 1726776635.06134: variable 'omit' from source: magic vars 9112 1726776635.06166: variable 'omit' from source: magic vars 9112 1726776635.06199: variable 'omit' from source: magic vars 9112 1726776635.06230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9112 1726776635.06264: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9112 1726776635.06287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9112 1726776635.06302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9112 1726776635.06313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9112 1726776635.06337: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9112 1726776635.06342: variable 'ansible_host' from source: host vars for 'managed_node2' 9112 1726776635.06346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9112 1726776635.06416: Set connection var ansible_connection to ssh 9112 1726776635.06424: Set connection var ansible_pipelining to False 9112 1726776635.06431: Set connection var ansible_timeout to 10 9112 1726776635.06439: Set connection var ansible_module_compression to ZIP_DEFLATED 9112 
1726776635.06444: Set connection var ansible_shell_type to sh 9112 1726776635.06449: Set connection var ansible_shell_executable to /bin/sh 9112 1726776635.06464: variable 'ansible_shell_executable' from source: unknown 9112 1726776635.06468: variable 'ansible_connection' from source: unknown 9112 1726776635.06471: variable 'ansible_module_compression' from source: unknown 9112 1726776635.06475: variable 'ansible_shell_type' from source: unknown 9112 1726776635.06478: variable 'ansible_shell_executable' from source: unknown 9112 1726776635.06482: variable 'ansible_host' from source: host vars for 'managed_node2' 9112 1726776635.06488: variable 'ansible_pipelining' from source: unknown 9112 1726776635.06492: variable 'ansible_timeout' from source: unknown 9112 1726776635.06496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9112 1726776635.06588: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9112 1726776635.06599: variable 'omit' from source: magic vars 9112 1726776635.06604: starting attempt loop 9112 1726776635.06607: running the handler 9112 1726776635.06860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9112 1726776635.08716: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9112 1726776635.08791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9112 1726776635.08827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9112 1726776635.08864: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
9112 1726776635.08891: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9112 1726776635.08956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9112 1726776635.09000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9112 1726776635.09026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9112 1726776635.09068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9112 1726776635.09082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9112 1726776635.09187: variable '__kernel_settings_changed' from source: set_fact 9112 1726776635.09205: Evaluated conditional (__kernel_settings_changed | d(false)): True 9112 1726776635.09213: handler run complete 9112 1726776635.09236: attempt loop complete, returning result 9112 1726776635.09241: _execute() done 9112 1726776635.09245: dumping result to json 9112 1726776635.09248: done dumping result, returning 9112 1726776635.09255: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [120fa90a-8a95-cec2-986e-000000000011] 9112 1726776635.09261: sending task result for task 120fa90a-8a95-cec2-986e-000000000011 9112 1726776635.09292: done sending task result for task 120fa90a-8a95-cec2-986e-000000000011 
9112 1726776635.09297: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776635.09671: no more pending results, returning what we have 8218 1726776635.09674: results queue empty 8218 1726776635.09675: checking for any_errors_fatal 8218 1726776635.09681: done checking for any_errors_fatal 8218 1726776635.09682: checking for max_fail_percentage 8218 1726776635.09686: done checking for max_fail_percentage 8218 1726776635.09687: checking to see if all hosts have failed and the running result is not ok 8218 1726776635.09688: done checking to see if all hosts have failed 8218 1726776635.09688: getting the remaining hosts for this loop 8218 1726776635.09689: done getting the remaining hosts for this loop 8218 1726776635.09693: getting the next task for host managed_node2 8218 1726776635.09698: done getting next task for host managed_node2 8218 1726776635.09700: ^ task is: TASK: Check sysfs after role runs 8218 1726776635.09702: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776635.09705: getting variables 8218 1726776635.09706: in VariableManager get_vars() 8218 1726776635.09738: Calling all_inventory to load vars for managed_node2 8218 1726776635.09742: Calling groups_inventory to load vars for managed_node2 8218 1726776635.09744: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776635.09753: Calling all_plugins_play to load vars for managed_node2 8218 1726776635.09756: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776635.09764: Calling groups_plugins_play to load vars for managed_node2 8218 1726776635.09935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776635.10188: done with get_vars() 8218 1726776635.10198: done getting variables 8218 1726776635.10255: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:79 Thursday 19 September 2024 16:10:35 -0400 (0:00:00.049) 0:00:20.933 **** 8218 1726776635.10280: entering _queue_task() for managed_node2/command 8218 1726776635.10467: worker is 1 (out of 1 available) 8218 1726776635.10481: exiting _queue_task() for managed_node2/command 8218 1726776635.10495: done queuing things up, now waiting for results queue to drain 8218 1726776635.10496: waiting for pending results... 
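Each task execution that follows runs through the same low-level lifecycle traced in this log: create a private remote tmp dir, transfer the AnsiballZ module payload over sftp, mark it executable, run it with the remote Python, and finally remove the tmp dir. A local stand-alone sketch of that sequence (run locally instead of over ssh, and with a tiny shell stand-in for the real AnsiballZ Python payload, so this is illustrative only):

```shell
#!/bin/sh
# Sketch of the _low_level_execute_command() sequence visible in the log.
set -eu

# 1. Create a private temp dir (the log wraps mkdir in 'umask 77' for mode 700).
tmpdir=$( (umask 77 && mktemp -d "${TMPDIR:-/tmp}/ansible-tmp-XXXXXX") )

# 2. "Transfer" the module; the real flow does 'sftp> put ... AnsiballZ_command.py'.
#    This stand-in just emits a module-style JSON result.
cat > "$tmpdir/AnsiballZ_command.sh" <<'EOF'
#!/bin/sh
printf '{"changed": true, "rc": 0}\n'
EOF

# 3. Make it executable, mirroring the log's 'chmod u+x' step.
chmod u+x "$tmpdir/AnsiballZ_command.sh"

# 4. Execute it; the real run invokes /usr/libexec/platform-python on the .py payload.
"$tmpdir/AnsiballZ_command.sh"

# 5. Clean up, mirroring the final 'rm -f -r .../ > /dev/null 2>&1 && sleep 0'.
rm -rf "$tmpdir"
```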
9117 1726776635.10703: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 9117 1726776635.10807: in run() - task 120fa90a-8a95-cec2-986e-000000000012 9117 1726776635.10824: variable 'ansible_search_path' from source: unknown 9117 1726776635.10858: calling self._execute() 9117 1726776635.10940: variable 'ansible_host' from source: host vars for 'managed_node2' 9117 1726776635.10949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9117 1726776635.10958: variable 'omit' from source: magic vars 9117 1726776635.11055: variable 'omit' from source: magic vars 9117 1726776635.11089: variable 'omit' from source: magic vars 9117 1726776635.11120: variable 'omit' from source: magic vars 9117 1726776635.11162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9117 1726776635.11196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9117 1726776635.11218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9117 1726776635.11238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9117 1726776635.11252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9117 1726776635.11280: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9117 1726776635.11289: variable 'ansible_host' from source: host vars for 'managed_node2' 9117 1726776635.11294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9117 1726776635.11393: Set connection var ansible_connection to ssh 9117 1726776635.11401: Set connection var ansible_pipelining to False 9117 1726776635.11408: Set connection var ansible_timeout to 10 9117 1726776635.11417: Set connection var ansible_module_compression to ZIP_DEFLATED 9117 
1726776635.11422: Set connection var ansible_shell_type to sh 9117 1726776635.11427: Set connection var ansible_shell_executable to /bin/sh 9117 1726776635.11448: variable 'ansible_shell_executable' from source: unknown 9117 1726776635.11453: variable 'ansible_connection' from source: unknown 9117 1726776635.11456: variable 'ansible_module_compression' from source: unknown 9117 1726776635.11460: variable 'ansible_shell_type' from source: unknown 9117 1726776635.11463: variable 'ansible_shell_executable' from source: unknown 9117 1726776635.11466: variable 'ansible_host' from source: host vars for 'managed_node2' 9117 1726776635.11469: variable 'ansible_pipelining' from source: unknown 9117 1726776635.11472: variable 'ansible_timeout' from source: unknown 9117 1726776635.11476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9117 1726776635.11672: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9117 1726776635.11688: variable 'omit' from source: magic vars 9117 1726776635.11695: starting attempt loop 9117 1726776635.11699: running the handler 9117 1726776635.11713: _low_level_execute_command(): starting 9117 1726776635.11721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9117 1726776635.14436: stdout chunk (state=2): >>>/root <<< 9117 1726776635.14696: stderr chunk (state=3): >>><<< 9117 1726776635.14703: stdout chunk (state=3): >>><<< 9117 1726776635.14723: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9117 1726776635.14739: _low_level_execute_command(): starting 9117 1726776635.14746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612 `" && echo ansible-tmp-1726776635.1473348-9117-66917295843612="` echo /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612 `" ) && sleep 0' 9117 1726776635.17468: stdout chunk (state=2): >>>ansible-tmp-1726776635.1473348-9117-66917295843612=/root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612 <<< 9117 1726776635.17610: stderr chunk (state=3): >>><<< 9117 1726776635.17620: stdout chunk (state=3): >>><<< 9117 1726776635.17640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776635.1473348-9117-66917295843612=/root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612 , stderr= 9117 1726776635.17668: variable 'ansible_module_compression' from source: unknown 9117 1726776635.17723: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9117 1726776635.17761: variable 'ansible_facts' from source: unknown 9117 1726776635.17869: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/AnsiballZ_command.py 9117 1726776635.18381: Sending initial data 9117 1726776635.18388: Sent initial data (153 bytes) 9117 1726776635.21159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp4qz1x4t8 /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/AnsiballZ_command.py <<< 9117 1726776635.22561: stderr chunk (state=3): >>><<< 9117 1726776635.22570: stdout chunk (state=3): >>><<< 9117 1726776635.22592: done transferring module to remote 9117 1726776635.22604: _low_level_execute_command(): starting 9117 1726776635.22612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/ /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/AnsiballZ_command.py && sleep 0' 9117 1726776635.25938: stderr chunk 
(state=2): >>><<< 9117 1726776635.25947: stdout chunk (state=2): >>><<< 9117 1726776635.25963: _low_level_execute_command() done: rc=0, stdout=, stderr= 9117 1726776635.25968: _low_level_execute_command(): starting 9117 1726776635.25973: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/AnsiballZ_command.py && sleep 0' 9117 1726776635.41918: stdout chunk (state=2): >>> {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:10:35.415185", "end": "2024-09-19 16:10:35.418488", "delta": "0:00:00.003303", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9117 1726776635.43173: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9117 1726776635.43186: stdout chunk (state=3): >>><<< 9117 1726776635.43200: stderr chunk (state=3): >>><<< 9117 1726776635.43215: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:10:35.415185", "end": "2024-09-19 16:10:35.418488", "delta": "0:00:00.003303", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
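The "Check sysfs after role runs" task above verifies the applied setting with `grep -x`, which succeeds only when an entire line equals the pattern, so a substring like `650001` cannot pass. A minimal sketch of that check, using a temp file as a stand-in for the real `/sys/class/net/lo/mtu` (the value 65000 comes from the logged command):

```shell
#!/bin/sh
# Simulate the logged check: grep -x 65000 /sys/class/net/lo/mtu
set -eu
tmpfile=$(mktemp)

# -x: the pattern must match the whole line, not a substring.
printf '65000\n' > "$tmpfile"
if grep -x 65000 "$tmpfile"; then
    echo "mtu check passed"
fi

# A near-miss value is rejected under -x:
printf '650001\n' > "$tmpfile"
grep -x 65000 "$tmpfile" || echo "mtu check failed as expected"

rm -f "$tmpfile"
```

Because grep's exit status is the success signal, the Ansible task needs no extra assertion step: a non-matching value makes the command module report a non-zero `rc`.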
9117 1726776635.43271: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -x 65000 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9117 1726776635.43282: _low_level_execute_command(): starting 9117 1726776635.43289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776635.1473348-9117-66917295843612/ > /dev/null 2>&1 && sleep 0' 9117 1726776635.45968: stderr chunk (state=2): >>><<< 9117 1726776635.45976: stdout chunk (state=2): >>><<< 9117 1726776635.45991: _low_level_execute_command() done: rc=0, stdout=, stderr= 9117 1726776635.45999: handler run complete 9117 1726776635.46021: Evaluated conditional (False): False 9117 1726776635.46035: attempt loop complete, returning result 9117 1726776635.46040: _execute() done 9117 1726776635.46042: dumping result to json 9117 1726776635.46048: done dumping result, returning 9117 1726776635.46054: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [120fa90a-8a95-cec2-986e-000000000012] 9117 1726776635.46061: sending task result for task 120fa90a-8a95-cec2-986e-000000000012 9117 1726776635.46101: done sending task result for task 120fa90a-8a95-cec2-986e-000000000012 9117 1726776635.46106: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "65000", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003303", "end": 
"2024-09-19 16:10:35.418488", "rc": 0, "start": "2024-09-19 16:10:35.415185" } STDOUT: 65000 8218 1726776635.46487: no more pending results, returning what we have 8218 1726776635.46490: results queue empty 8218 1726776635.46491: checking for any_errors_fatal 8218 1726776635.46495: done checking for any_errors_fatal 8218 1726776635.46496: checking for max_fail_percentage 8218 1726776635.46497: done checking for max_fail_percentage 8218 1726776635.46498: checking to see if all hosts have failed and the running result is not ok 8218 1726776635.46499: done checking to see if all hosts have failed 8218 1726776635.46499: getting the remaining hosts for this loop 8218 1726776635.46501: done getting the remaining hosts for this loop 8218 1726776635.46504: getting the next task for host managed_node2 8218 1726776635.46508: done getting next task for host managed_node2 8218 1726776635.46510: ^ task is: TASK: Check sysctl after role runs 8218 1726776635.46512: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776635.46515: getting variables 8218 1726776635.46516: in VariableManager get_vars() 8218 1726776635.46550: Calling all_inventory to load vars for managed_node2 8218 1726776635.46553: Calling groups_inventory to load vars for managed_node2 8218 1726776635.46555: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776635.46564: Calling all_plugins_play to load vars for managed_node2 8218 1726776635.46567: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776635.46570: Calling groups_plugins_play to load vars for managed_node2 8218 1726776635.46738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776635.46928: done with get_vars() 8218 1726776635.46942: done getting variables 8218 1726776635.46997: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysctl after role runs] ******************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:83 Thursday 19 September 2024 16:10:35 -0400 (0:00:00.367) 0:00:21.300 **** 8218 1726776635.47024: entering _queue_task() for managed_node2/shell 8218 1726776635.47215: worker is 1 (out of 1 available) 8218 1726776635.47230: exiting _queue_task() for managed_node2/shell 8218 1726776635.47240: done queuing things up, now waiting for results queue to drain 8218 1726776635.47242: waiting for pending results... 
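The `( umask 77 && mkdir ... )` incantation in the remote commands above is how Ansible keeps its tmp dir private: running `umask` inside a subshell restricts the mask only for that `mkdir`, yielding a mode-700 directory without disturbing the caller's umask. A small sketch of the idiom (the `stat -c` mode check is GNU-specific, with a BSD fallback):

```shell
#!/bin/sh
set -eu
# Create a directory with owner-only permissions, mirroring the
# '( umask 77 && mkdir ... )' pattern from the log. mkdir's default
# mode 0777 is masked by 077, leaving 0700.
dir="${TMPDIR:-/tmp}/demo.$$"
( umask 77 && mkdir "$dir" )

mode=$(stat -c '%a' "$dir" 2>/dev/null || stat -f '%Lp' "$dir")
echo "mode=$mode"   # 700: group and other bits stripped by the masked mkdir

rmdir "$dir"
```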
9138 1726776635.47536: running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs 9138 1726776635.47636: in run() - task 120fa90a-8a95-cec2-986e-000000000013 9138 1726776635.47652: variable 'ansible_search_path' from source: unknown 9138 1726776635.47683: calling self._execute() 9138 1726776635.47762: variable 'ansible_host' from source: host vars for 'managed_node2' 9138 1726776635.47771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9138 1726776635.47780: variable 'omit' from source: magic vars 9138 1726776635.47877: variable 'omit' from source: magic vars 9138 1726776635.47907: variable 'omit' from source: magic vars 9138 1726776635.47939: variable 'omit' from source: magic vars 9138 1726776635.47978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9138 1726776635.48010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9138 1726776635.48034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9138 1726776635.48050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9138 1726776635.48062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9138 1726776635.48089: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9138 1726776635.48095: variable 'ansible_host' from source: host vars for 'managed_node2' 9138 1726776635.48099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9138 1726776635.48192: Set connection var ansible_connection to ssh 9138 1726776635.48201: Set connection var ansible_pipelining to False 9138 1726776635.48208: Set connection var ansible_timeout to 10 9138 1726776635.48215: Set connection var ansible_module_compression to ZIP_DEFLATED 9138 
1726776635.48221: Set connection var ansible_shell_type to sh 9138 1726776635.48226: Set connection var ansible_shell_executable to /bin/sh 9138 1726776635.48247: variable 'ansible_shell_executable' from source: unknown 9138 1726776635.48252: variable 'ansible_connection' from source: unknown 9138 1726776635.48255: variable 'ansible_module_compression' from source: unknown 9138 1726776635.48259: variable 'ansible_shell_type' from source: unknown 9138 1726776635.48261: variable 'ansible_shell_executable' from source: unknown 9138 1726776635.48264: variable 'ansible_host' from source: host vars for 'managed_node2' 9138 1726776635.48268: variable 'ansible_pipelining' from source: unknown 9138 1726776635.48270: variable 'ansible_timeout' from source: unknown 9138 1726776635.48274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9138 1726776635.48390: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9138 1726776635.48403: variable 'omit' from source: magic vars 9138 1726776635.48409: starting attempt loop 9138 1726776635.48412: running the handler 9138 1726776635.48420: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9138 1726776635.48538: _low_level_execute_command(): starting 9138 1726776635.48551: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9138 1726776635.51207: stdout chunk (state=2): >>>/root <<< 9138 1726776635.51341: stderr chunk (state=3): >>><<< 9138 1726776635.51349: stdout chunk (state=3): >>><<< 9138 
1726776635.51367: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9138 1726776635.51381: _low_level_execute_command(): starting 9138 1726776635.51386: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746 `" && echo ansible-tmp-1726776635.5137513-9138-90635020758746="` echo /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746 `" ) && sleep 0' 9138 1726776635.54124: stdout chunk (state=2): >>>ansible-tmp-1726776635.5137513-9138-90635020758746=/root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746 <<< 9138 1726776635.54268: stderr chunk (state=3): >>><<< 9138 1726776635.54277: stdout chunk (state=3): >>><<< 9138 1726776635.54295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776635.5137513-9138-90635020758746=/root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746 , stderr= 9138 1726776635.54325: variable 'ansible_module_compression' from source: unknown 9138 1726776635.54382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9138 1726776635.54417: variable 'ansible_facts' from source: unknown 9138 1726776635.54525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/AnsiballZ_command.py 9138 1726776635.54970: Sending initial data 9138 1726776635.54978: Sent initial data (153 bytes) 9138 1726776635.57503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmporik52l3 /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/AnsiballZ_command.py <<< 9138 1726776635.59080: stderr chunk (state=3): >>><<< 9138 1726776635.59090: stdout chunk (state=3): >>><<< 9138 1726776635.59112: done transferring module to remote 9138 1726776635.59124: _low_level_execute_command(): starting 9138 
1726776635.59132: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/ /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/AnsiballZ_command.py && sleep 0' 9138 1726776635.61968: stderr chunk (state=2): >>><<< 9138 1726776635.61979: stdout chunk (state=2): >>><<< 9138 1726776635.61996: _low_level_execute_command() done: rc=0, stdout=, stderr= 9138 1726776635.62001: _low_level_execute_command(): starting 9138 1726776635.62007: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/AnsiballZ_command.py && sleep 0' 9138 1726776635.78558: stdout chunk (state=2): >>> {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 16:10:35.774367", "end": "2024-09-19 16:10:35.783969", "delta": "0:00:00.009602", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9138 1726776635.80039: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 9138 1726776635.80087: stderr chunk (state=3): >>><<< 9138 1726776635.80097: stdout chunk (state=3): >>><<< 9138 1726776635.80114: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 16:10:35.774367", "end": "2024-09-19 16:10:35.783969", "delta": "0:00:00.009602", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 9138 1726776635.80238: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9138 1726776635.80250: _low_level_execute_command(): starting 9138 1726776635.80256: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776635.5137513-9138-90635020758746/ > /dev/null 2>&1 && sleep 0' 9138 1726776635.82921: stderr chunk (state=2): >>><<< 9138 1726776635.82931: stdout chunk (state=2): >>><<< 9138 1726776635.82947: _low_level_execute_command() done: rc=0, 
stdout=, stderr= 9138 1726776635.82955: handler run complete 9138 1726776635.82977: Evaluated conditional (False): False 9138 1726776635.82990: attempt loop complete, returning result 9138 1726776635.82994: _execute() done 9138 1726776635.82997: dumping result to json 9138 1726776635.83002: done dumping result, returning 9138 1726776635.83009: done running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs [120fa90a-8a95-cec2-986e-000000000013] 9138 1726776635.83016: sending task result for task 120fa90a-8a95-cec2-986e-000000000013 9138 1726776635.83057: done sending task result for task 120fa90a-8a95-cec2-986e-000000000013 9138 1726776635.83062: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "delta": "0:00:00.009602", "end": "2024-09-19 16:10:35.783969", "rc": 0, "start": "2024-09-19 16:10:35.774367" } STDOUT: 400000 8218 1726776635.83439: no more pending results, returning what we have 8218 1726776635.83441: results queue empty 8218 1726776635.83442: checking for any_errors_fatal 8218 1726776635.83449: done checking for any_errors_fatal 8218 1726776635.83450: checking for max_fail_percentage 8218 1726776635.83451: done checking for max_fail_percentage 8218 1726776635.83452: checking to see if all hosts have failed and the running result is not ok 8218 1726776635.83453: done checking to see if all hosts have failed 8218 1726776635.83453: getting the remaining hosts for this loop 8218 1726776635.83454: done getting the remaining hosts for this loop 8218 1726776635.83457: getting the next task for host managed_node2 8218 1726776635.83461: done getting next task for host managed_node2 8218 1726776635.83463: ^ task is: TASK: Check sysctl after role runs 8218 1726776635.83465: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776635.83468: getting variables 8218 1726776635.83470: in VariableManager get_vars() 8218 1726776635.83505: Calling all_inventory to load vars for managed_node2 8218 1726776635.83508: Calling groups_inventory to load vars for managed_node2 8218 1726776635.83510: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776635.83519: Calling all_plugins_play to load vars for managed_node2 8218 1726776635.83522: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776635.83524: Calling groups_plugins_play to load vars for managed_node2 8218 1726776635.83752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776635.83949: done with get_vars() 8218 1726776635.83960: done getting variables 8218 1726776635.84020: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysctl after role runs] ******************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:89 Thursday 19 September 2024 16:10:35 -0400 (0:00:00.370) 0:00:21.671 **** 8218 1726776635.84047: entering _queue_task() for managed_node2/shell 8218 1726776635.84238: worker is 1 (out of 1 available) 8218 1726776635.84251: exiting _queue_task() for managed_node2/shell 8218 1726776635.84262: done queuing things up, now waiting for results queue to drain 8218 1726776635.84264: waiting for pending results... 
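Both sysctl checks wrap the pipeline in `set -euo pipefail`, so a failing `sysctl` fails the task even though it is not the last command in the pipe. The first check demands an exact value (`grep -x 400000`); the later `kernel.threads-max` check inverts the logic with `-v` and silences output with `-q`, succeeding when the value is anything other than the old 29968 (I demonstrate only the `-xvq` portion of the logged `-Lxvq` flags). A sketch of both patterns, with an echoing stand-in for `sysctl -n`:

```shell
#!/bin/bash
# fake_sysctl is a stand-in; the real tasks read live kernel values.
fake_sysctl() { echo "$1"; }

# pipefail makes a failure anywhere in the pipe the pipe's exit status.
set -euo pipefail

# Exact-match check, as in: sysctl -n fs.file-max | grep -x 400000
fake_sysctl 400000 | grep -x 400000

# Inverted check, as in: sysctl -n kernel.threads-max | grep -xvq 29968
# -v inverts the match, -q suppresses output; exit 0 means "not 29968".
if fake_sysctl 30000 | grep -xvq 29968; then
    echo "threads-max differs from the old value, as expected"
fi
```

Without `pipefail`, a broken `sysctl` piped into a permissive grep could mask a real failure, which is presumably why the test playbook prepends `set -euo pipefail` to every shell-based check.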
9166 1726776635.84541: running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs 9166 1726776635.84647: in run() - task 120fa90a-8a95-cec2-986e-000000000014 9166 1726776635.84664: variable 'ansible_search_path' from source: unknown 9166 1726776635.84701: calling self._execute() 9166 1726776635.84782: variable 'ansible_host' from source: host vars for 'managed_node2' 9166 1726776635.84794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9166 1726776635.84802: variable 'omit' from source: magic vars 9166 1726776635.84897: variable 'omit' from source: magic vars 9166 1726776635.84927: variable 'omit' from source: magic vars 9166 1726776635.84960: variable 'omit' from source: magic vars 9166 1726776635.85005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9166 1726776635.85038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9166 1726776635.85061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9166 1726776635.85078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9166 1726776635.85094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9166 1726776635.85121: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9166 1726776635.85127: variable 'ansible_host' from source: host vars for 'managed_node2' 9166 1726776635.85134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9166 1726776635.85230: Set connection var ansible_connection to ssh 9166 1726776635.85239: Set connection var ansible_pipelining to False 9166 1726776635.85246: Set connection var ansible_timeout to 10 9166 1726776635.85254: Set connection var ansible_module_compression to ZIP_DEFLATED 9166 
1726776635.85260: Set connection var ansible_shell_type to sh 9166 1726776635.85265: Set connection var ansible_shell_executable to /bin/sh 9166 1726776635.85283: variable 'ansible_shell_executable' from source: unknown 9166 1726776635.85291: variable 'ansible_connection' from source: unknown 9166 1726776635.85295: variable 'ansible_module_compression' from source: unknown 9166 1726776635.85298: variable 'ansible_shell_type' from source: unknown 9166 1726776635.85301: variable 'ansible_shell_executable' from source: unknown 9166 1726776635.85303: variable 'ansible_host' from source: host vars for 'managed_node2' 9166 1726776635.85307: variable 'ansible_pipelining' from source: unknown 9166 1726776635.85309: variable 'ansible_timeout' from source: unknown 9166 1726776635.85313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9166 1726776635.85514: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9166 1726776635.85527: variable 'omit' from source: magic vars 9166 1726776635.85537: starting attempt loop 9166 1726776635.85542: running the handler 9166 1726776635.85551: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9166 1726776635.85567: _low_level_execute_command(): starting 9166 1726776635.85574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9166 1726776635.88500: stdout chunk (state=2): >>>/root <<< 9166 1726776635.88643: stderr chunk (state=3): >>><<< 9166 1726776635.88651: stdout chunk (state=3): >>><<< 9166 
1726776635.88672: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9166 1726776635.88686: _low_level_execute_command(): starting 9166 1726776635.88693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624 `" && echo ansible-tmp-1726776635.8868082-9166-114691910428624="` echo /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624 `" ) && sleep 0' 9166 1726776635.91621: stdout chunk (state=2): >>>ansible-tmp-1726776635.8868082-9166-114691910428624=/root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624 <<< 9166 1726776635.91769: stderr chunk (state=3): >>><<< 9166 1726776635.91776: stdout chunk (state=3): >>><<< 9166 1726776635.91791: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776635.8868082-9166-114691910428624=/root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624 , stderr= 9166 1726776635.91818: variable 'ansible_module_compression' from source: unknown 9166 1726776635.91876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9166 1726776635.91910: variable 'ansible_facts' from source: unknown 9166 1726776635.92017: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/AnsiballZ_command.py 9166 1726776635.92463: Sending initial data 9166 1726776635.92469: Sent initial data (154 bytes) 9166 1726776635.94941: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp4v9m4df2 /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/AnsiballZ_command.py <<< 9166 1726776635.96308: stderr chunk (state=3): >>><<< 9166 1726776635.96317: stdout chunk (state=3): >>><<< 9166 1726776635.96338: done transferring module to remote 9166 1726776635.96349: _low_level_execute_command(): starting 
9166 1726776635.96354: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/ /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/AnsiballZ_command.py && sleep 0' 9166 1726776635.99622: stderr chunk (state=2): >>><<< 9166 1726776635.99634: stdout chunk (state=2): >>><<< 9166 1726776635.99652: _low_level_execute_command() done: rc=0, stdout=, stderr= 9166 1726776635.99658: _low_level_execute_command(): starting 9166 1726776635.99664: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/AnsiballZ_command.py && sleep 0' 9166 1726776636.16774: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 16:10:36.152550", "end": "2024-09-19 16:10:36.158717", "delta": "0:00:00.006167", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9166 1726776636.17321: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 9166 1726776636.17333: stdout chunk (state=3): >>><<< 9166 1726776636.17348: stderr chunk (state=3): >>><<< 9166 1726776636.17363: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 16:10:36.152550", "end": "2024-09-19 16:10:36.158717", "delta": "0:00:00.006167", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 9166 1726776636.17417: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9166 1726776636.17430: _low_level_execute_command(): starting 9166 1726776636.17437: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776635.8868082-9166-114691910428624/ > /dev/null 2>&1 && sleep 0' 9166 1726776636.22595: stderr chunk (state=2): >>><<< 9166 1726776636.22606: stdout chunk (state=2): >>><<< 9166 1726776636.22623: 
_low_level_execute_command() done: rc=0, stdout=, stderr=
9166 1726776636.22632: handler run complete
9166 1726776636.22656: Evaluated conditional (False): False
9166 1726776636.22668: attempt loop complete, returning result
9166 1726776636.22672: _execute() done
9166 1726776636.22675: dumping result to json
9166 1726776636.22679: done dumping result, returning
9166 1726776636.22689: done running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs [120fa90a-8a95-cec2-986e-000000000014]
9166 1726776636.22697: sending task result for task 120fa90a-8a95-cec2-986e-000000000014
9166 1726776636.22739: done sending task result for task 120fa90a-8a95-cec2-986e-000000000014
9166 1726776636.22744: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968",
    "delta": "0:00:00.006167",
    "end": "2024-09-19 16:10:36.158717",
    "rc": 0,
    "start": "2024-09-19 16:10:36.152550"
}
8218 1726776636.24090: no more pending results, returning what we have
8218 1726776636.24093: results queue empty
8218 1726776636.24094: checking for any_errors_fatal
8218 1726776636.24101: done checking for any_errors_fatal
8218 1726776636.24102: checking for max_fail_percentage
8218 1726776636.24103: done checking for max_fail_percentage
8218 1726776636.24104: checking to see if all hosts have failed and the running result is not ok
8218 1726776636.24105: done checking to see if all hosts have failed
8218 1726776636.24105: getting the remaining hosts for this loop
8218 1726776636.24107: done getting the remaining hosts for this loop
8218 1726776636.24110: getting the next task for host managed_node2
8218 1726776636.24115: done getting next task for host managed_node2
8218 1726776636.24117: ^ task is: TASK: Reboot the machine - see if settings persist after reboot
8218 1726776636.24119: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776636.24122: getting variables
8218 1726776636.24123: in VariableManager get_vars()
8218 1726776636.24159: Calling all_inventory to load vars for managed_node2
8218 1726776636.24162: Calling groups_inventory to load vars for managed_node2
8218 1726776636.24164: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776636.24173: Calling all_plugins_play to load vars for managed_node2
8218 1726776636.24176: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776636.24179: Calling groups_plugins_play to load vars for managed_node2
8218 1726776636.24347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776636.24533: done with get_vars()
8218 1726776636.24546: done getting variables
8218 1726776636.24603: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Reboot the machine - see if settings persist after reboot] ***************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:95
Thursday 19 September 2024 16:10:36 -0400 (0:00:00.405) 0:00:22.076 ****
8218 1726776636.24633: entering _queue_task() for managed_node2/reboot
8218 1726776636.24841: worker is 1 (out of 1 available)
8218 1726776636.24855: exiting _queue_task() for managed_node2/reboot
8218 1726776636.24866: done queuing things up, now waiting for results queue to drain
8218 1726776636.24868: waiting for pending results...
9197 1726776636.25409: running TaskExecutor() for managed_node2/TASK: Reboot the machine - see if settings persist after reboot 9197 1726776636.25519: in run() - task 120fa90a-8a95-cec2-986e-000000000015 9197 1726776636.25537: variable 'ansible_search_path' from source: unknown 9197 1726776636.25570: calling self._execute() 9197 1726776636.25651: variable 'ansible_host' from source: host vars for 'managed_node2' 9197 1726776636.25661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9197 1726776636.25670: variable 'omit' from source: magic vars 9197 1726776636.25767: variable 'omit' from source: magic vars 9197 1726776636.25801: variable 'omit' from source: magic vars 9197 1726776636.25836: variable 'omit' from source: magic vars 9197 1726776636.25878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9197 1726776636.25913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9197 1726776636.25939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9197 1726776636.25956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9197 1726776636.25969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9197 1726776636.26001: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9197 1726776636.26008: variable 'ansible_host' from source: host vars for 'managed_node2' 9197 1726776636.26012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9197 1726776636.26110: Set connection var ansible_connection to ssh 9197 1726776636.26119: Set connection var ansible_pipelining to False 9197 1726776636.26126: Set connection var ansible_timeout to 10 9197 1726776636.26136: Set connection var 
ansible_module_compression to ZIP_DEFLATED 9197 1726776636.26142: Set connection var ansible_shell_type to sh 9197 1726776636.26147: Set connection var ansible_shell_executable to /bin/sh 9197 1726776636.26166: variable 'ansible_shell_executable' from source: unknown 9197 1726776636.26172: variable 'ansible_connection' from source: unknown 9197 1726776636.26175: variable 'ansible_module_compression' from source: unknown 9197 1726776636.26178: variable 'ansible_shell_type' from source: unknown 9197 1726776636.26181: variable 'ansible_shell_executable' from source: unknown 9197 1726776636.26184: variable 'ansible_host' from source: host vars for 'managed_node2' 9197 1726776636.26190: variable 'ansible_pipelining' from source: unknown 9197 1726776636.26193: variable 'ansible_timeout' from source: unknown 9197 1726776636.26197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9197 1726776636.26395: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 9197 1726776636.26411: variable 'omit' from source: magic vars 9197 1726776636.26417: starting attempt loop 9197 1726776636.26420: running the handler 9197 1726776636.26427: reboot: running setup module to get distribution 9197 1726776636.26440: _low_level_execute_command(): starting 9197 1726776636.26448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9197 1726776636.33352: stdout chunk (state=2): >>>/root <<< 9197 1726776636.33367: stderr chunk (state=2): >>><<< 9197 1726776636.33382: stdout chunk (state=3): >>><<< 9197 1726776636.33405: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9197 1726776636.33420: _low_level_execute_command(): starting 9197 1726776636.33427: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368 `" && echo ansible-tmp-1726776636.3341432-9197-12738887004368="` echo /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368 `" ) && sleep 0' 9197 1726776636.37134: stdout chunk (state=2): >>>ansible-tmp-1726776636.3341432-9197-12738887004368=/root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368 <<< 9197 1726776636.37885: stderr chunk (state=3): >>><<< 9197 1726776636.37894: stdout chunk (state=3): >>><<< 9197 1726776636.37912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776636.3341432-9197-12738887004368=/root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368 , stderr= 9197 1726776636.37945: variable 'ansible_module_compression' from source: unknown 9197 1726776636.37999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9197 1726776636.38057: variable 'ansible_facts' from source: unknown 9197 1726776636.38290: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_setup.py 9197 1726776636.38749: Sending initial data 9197 1726776636.38755: Sent initial data (151 bytes) 9197 1726776636.42154: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp65tpms37 /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_setup.py <<< 9197 1726776636.46238: stderr chunk (state=3): >>><<< 9197 1726776636.46247: stdout chunk (state=3): >>><<< 9197 1726776636.46272: done transferring module to remote 9197 1726776636.46288: _low_level_execute_command(): starting 9197 1726776636.46295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/ 
/root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_setup.py && sleep 0' 9197 1726776636.49402: stderr chunk (state=2): >>><<< 9197 1726776636.49412: stdout chunk (state=2): >>><<< 9197 1726776636.49428: _low_level_execute_command() done: rc=0, stdout=, stderr= 9197 1726776636.49437: _low_level_execute_command(): starting 9197 1726776636.49443: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_setup.py && sleep 0' 9197 1726776636.79436: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-75", "ansible_nodename": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "f9cbf545a7bd4357ac99f30c9cf5a21a", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 
0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.10.210 50220 10.31.12.75 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.10.210 50220 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANE82uVxtVfIhrJuj+Z56Gg9bJlDR+yYhhx40nddz9Kp8spHmHNvFWFn+e7QAGUfCb+6Bn9hYTJ01jfHaC5ohG1MVnAda9CG+H9c6PQ+gHSYPJWdLw/SGnwyt5N6bir8gJvf5eqrXg0FFbm02VDZJAH4ww7gBr9WPGM4PY1Xusd3AAAAFQDMFufFoivmmHcgCnY/kt+ytzmOaQAAAIBcsxi9kZhiwOrK7psJGYdQI1cVUnaqKMfAmHz2cWmhI4jYrMQOwdMF3XQMyXgmrePEWFnuov5VbepbLu43oTrQo18/5uhe6kek0DuOeKivfAx8E4a6lh3OiSNw8mu5dYVcLv+bd4Kj97aZb9Gc715QJAj3ImLk7gMK0nFbaUkdZAAAAIA7wufmEs3LK2y8ttz87wJ4frWgcvNvSRJjeZACpPTicryWGrcOtjvdBeYguJ9vlncJisC4nPK3GYKg7yxbmiWL5TPmvQTT6fsy7cLlKkmIbtui/icHcNPTfBXqvJa3ynXTEfNrbid/WOzdTSO0utdrr4LeOgfnqsuif0W/n1CZ7A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCnzq1r0NFAn93EqXQx38G6hyHP8YT9vzeAiqkNxeexjxvgmckt0NM6KskMEaUcog0Gj/4V2FN7mOfj0Gbefb5fespgQ1eR63jDKf/JPDIQ3nN0RnnM77fe7kBba4QXYVDpY7zYR2LorYUWCCWu9hRdsPJzsf22DmXyyZ96gipq1Rg0VnVbyg7oorYufErgnBCXQtgE9Ffr9z2J40csh9GmrmoV5X8JZoSUWePkZXOackxaNHMOBPI5rzUnEjaTP5xZpIc6YAHBQKY0YRocgPk4Bdyku9zRtEeumHFPIvJCfr2mPjdStySjjtKdbJqbKfma3xtz0jVbVoTfltqc6+h5y99ObN99DAe1mmhzp6DrddXYqxdnhd9DaKzrbLE3uRGmSBa2rjmreEjaI+RKLtUwgeWHjgfJQG2OPki9a2VcHfLhBaH0TZhuW7CTEycsVb5YIgqDNdlnV3KDx/cWFHF9q9elURUwgyrD28mszD7cnZEMBmsfdPa3heOOGmZaqkM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNjw/e9P8QOb520cgNDhZpEn2NWDALjpSK9lpaLEOJsey1fh+RV1Bkt5jHN+4WHIvyqRDGP8roN+DvPf+K8h9oI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMHfdyzKk5Ao/6nDckScEFXouLZL8ZutL+VmnNfL8fQj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": 
"09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "16", "minute": "10", "second": "36", "epoch": "1726776636", "epoch_int": "1726776636", "date": "2024-09-19", "time": "16:10:36", "iso8601_micro": "2024-09-19T20:10:36.779207Z", "iso8601": "2024-09-19T20:10:36Z", "iso8601_basic": "20240919T161036779207", "iso8601_basic_short": "20240919T161036", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 9197 1726776636.80600: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776636.80612: stdout chunk (state=3): >>><<< 9197 1726776636.80622: stderr chunk (state=3): >>><<< 9197 1726776636.80642: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-75", "ansible_nodename": "ip-10-31-12-75.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "f9cbf545a7bd4357ac99f30c9cf5a21a", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.10.210 50220 10.31.12.75 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.10.210 50220 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANE82uVxtVfIhrJuj+Z56Gg9bJlDR+yYhhx40nddz9Kp8spHmHNvFWFn+e7QAGUfCb+6Bn9hYTJ01jfHaC5ohG1MVnAda9CG+H9c6PQ+gHSYPJWdLw/SGnwyt5N6bir8gJvf5eqrXg0FFbm02VDZJAH4ww7gBr9WPGM4PY1Xusd3AAAAFQDMFufFoivmmHcgCnY/kt+ytzmOaQAAAIBcsxi9kZhiwOrK7psJGYdQI1cVUnaqKMfAmHz2cWmhI4jYrMQOwdMF3XQMyXgmrePEWFnuov5VbepbLu43oTrQo18/5uhe6kek0DuOeKivfAx8E4a6lh3OiSNw8mu5dYVcLv+bd4Kj97aZb9Gc715QJAj3ImLk7gMK0nFbaUkdZAAAAIA7wufmEs3LK2y8ttz87wJ4frWgcvNvSRJjeZACpPTicryWGrcOtjvdBeYguJ9vlncJisC4nPK3GYKg7yxbmiWL5TPmvQTT6fsy7cLlKkmIbtui/icHcNPTfBXqvJa3ynXTEfNrbid/WOzdTSO0utdrr4LeOgfnqsuif0W/n1CZ7A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCnzq1r0NFAn93EqXQx38G6hyHP8YT9vzeAiqkNxeexjxvgmckt0NM6KskMEaUcog0Gj/4V2FN7mOfj0Gbefb5fespgQ1eR63jDKf/JPDIQ3nN0RnnM77fe7kBba4QXYVDpY7zYR2LorYUWCCWu9hRdsPJzsf22DmXyyZ96gipq1Rg0VnVbyg7oorYufErgnBCXQtgE9Ffr9z2J40csh9GmrmoV5X8JZoSUWePkZXOackxaNHMOBPI5rzUnEjaTP5xZpIc6YAHBQKY0YRocgPk4Bdyku9zRtEeumHFPIvJCfr2mPjdStySjjtKdbJqbKfma3xtz0jVbVoTfltqc6+h5y99ObN99DAe1mmhzp6DrddXYqxdnhd9DaKzrbLE3uRGmSBa2rjmreEjaI+RKLtUwgeWHjgfJQG2OPki9a2VcHfLhBaH0TZhuW7CTEycsVb5YIgqDNdlnV3KDx/cWFHF9q9elURUwgyrD28mszD7cnZEMBmsfdPa3heOOGmZaqkM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNjw/e9P8QOb520cgNDhZpEn2NWDALjpSK9lpaLEOJsey1fh+RV1Bkt5jHN+4WHIvyqRDGP8roN+DvPf+K8h9oI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMHfdyzKk5Ao/6nDckScEFXouLZL8ZutL+VmnNfL8fQj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, 
"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "16", "minute": "10", "second": "36", "epoch": "1726776636", "epoch_int": "1726776636", "date": "2024-09-19", "time": "16:10:36", "iso8601_micro": "2024-09-19T20:10:36.779207Z", "iso8601": "2024-09-19T20:10:36Z", "iso8601_basic": "20240919T161036779207", "iso8601_basic_short": "20240919T161036", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.12.75 closed. 9197 1726776636.80834: done with _execute_module (ansible.legacy.setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9197 1726776636.80858: reboot: distribution: {'name': 'centos', 'version': '8', 'family': 'redhat'} 9197 1726776636.80869: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9197 1726776636.80873: _low_level_execute_command(): starting 9197 1726776636.80879: _low_level_execute_command(): executing: /bin/sh -c 'cat 
/proc/sys/kernel/random/boot_id && sleep 0' 9197 1726776636.84734: stdout chunk (state=2): >>>ac521673-7fcd-458c-a467-85be3447b4e6 <<< 9197 1726776636.84743: stderr chunk (state=2): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776636.84753: stdout chunk (state=3): >>><<< 9197 1726776636.84757: stderr chunk (state=3): >>><<< 9197 1726776636.84770: _low_level_execute_command() done: rc=0, stdout=ac521673-7fcd-458c-a467-85be3447b4e6 , stderr=Shared connection to 10.31.12.75 closed. 9197 1726776636.84777: reboot: last boot time: ac521673-7fcd-458c-a467-85be3447b4e6 9197 1726776636.84794: reboot: connect_timeout connection option has not been set 9197 1726776636.84804: reboot: running find module looking in ['/sbin', '/bin', '/usr/sbin', '/usr/bin', '/usr/local/sbin'] to get path for "shutdown" 9197 1726776636.84823: variable 'ansible_module_compression' from source: unknown 9197 1726776636.84870: ANSIBALLZ: Using generic lock for ansible.legacy.find 9197 1726776636.84876: ANSIBALLZ: Acquiring lock 9197 1726776636.84879: ANSIBALLZ: Lock acquired: 140571206407024 9197 1726776636.84883: ANSIBALLZ: Creating module 9197 1726776636.98876: ANSIBALLZ: Writing module into payload 9197 1726776636.99011: ANSIBALLZ: Writing module 9197 1726776636.99036: ANSIBALLZ: Renaming module 9197 1726776636.99044: ANSIBALLZ: Done creating module 9197 1726776636.99057: variable 'ansible_facts' from source: unknown 9197 1726776636.99138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_find.py 9197 1726776637.00603: Sending initial data 9197 1726776637.00610: Sent initial data (150 bytes) 9197 1726776637.03338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpfrhxu9id /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_find.py <<< 9197 1726776637.05392: stderr chunk (state=3): >>><<< 9197 1726776637.05399: stdout chunk (state=3): >>><<< 9197 1726776637.05417: 
done transferring module to remote 9197 1726776637.05426: _low_level_execute_command(): starting 9197 1726776637.05432: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/ /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_find.py && sleep 0' 9197 1726776637.08050: stderr chunk (state=2): >>><<< 9197 1726776637.08058: stdout chunk (state=2): >>><<< 9197 1726776637.08075: _low_level_execute_command() done: rc=0, stdout=, stderr= 9197 1726776637.08080: _low_level_execute_command(): starting 9197 1726776637.08085: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/AnsiballZ_find.py && sleep 0' 9197 1726776637.48169: stdout chunk (state=2): >>> {"files": [{"path": "/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726776637.2389402, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}, {"path": "/usr/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726776637.2389402, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}], "changed": false, "msg": "All paths examined", "matched": 2, "examined": 2640, "skipped_paths": {}, 
"invocation": {"module_args": {"paths": ["/sbin", "/bin", "/usr/sbin", "/usr/bin", "/usr/local/sbin"], "patterns": ["shutdown"], "file_type": "any", "read_whole_file": false, "age_stamp": "mtime", "recurse": false, "hidden": false, "follow": false, "get_checksum": false, "use_regex": false, "exact_mode": true, "excludes": null, "contains": null, "age": null, "size": null, "depth": null, "mode": null}}} <<< 9197 1726776637.49304: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776637.49351: stderr chunk (state=3): >>><<< 9197 1726776637.49358: stdout chunk (state=3): >>><<< 9197 1726776637.49373: _low_level_execute_command() done: rc=0, stdout= {"files": [{"path": "/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726776637.2389402, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}, {"path": "/usr/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726776637.2389402, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}], "changed": false, "msg": "All paths examined", "matched": 2, "examined": 2640, "skipped_paths": {}, "invocation": {"module_args": {"paths": ["/sbin", "/bin", "/usr/sbin", "/usr/bin", "/usr/local/sbin"], "patterns": ["shutdown"], "file_type": "any", "read_whole_file": false, "age_stamp": 
"mtime", "recurse": false, "hidden": false, "follow": false, "get_checksum": false, "use_regex": false, "exact_mode": true, "excludes": null, "contains": null, "age": null, "size": null, "depth": null, "mode": null}}} , stderr=Shared connection to 10.31.12.75 closed. 9197 1726776637.49419: done with _execute_module (ansible.legacy.find, {'paths': ['/sbin', '/bin', '/usr/sbin', '/usr/bin', '/usr/local/sbin'], 'patterns': ['shutdown'], 'file_type': 'any', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.find', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9197 1726776637.49433: reboot: rebooting server with command '/sbin/shutdown -r 0 "Reboot initiated by Ansible"' 9197 1726776637.49437: _low_level_execute_command(): starting 9197 1726776637.49442: _low_level_execute_command(): executing: /bin/sh -c '/sbin/shutdown -r 0 "Reboot initiated by Ansible" && sleep 0' 9197 1726776637.53079: stdout chunk (state=2): >>>Shutdown scheduled for Thu 2024-09-19 16:10:37 EDT, use 'shutdown -c' to cancel. <<< 9197 1726776637.53504: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776637.53514: stdout chunk (state=3): >>><<< 9197 1726776637.53525: stderr chunk (state=3): >>><<< 9197 1726776637.53540: _low_level_execute_command() done: rc=0, stdout=Shutdown scheduled for Thu 2024-09-19 16:10:37 EDT, use 'shutdown -c' to cancel. , stderr=Shared connection to 10.31.12.75 closed. 
9197 1726776637.53556: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9197 1726776637.53561: _low_level_execute_command(): starting 9197 1726776637.53567: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9197 1726776637.58279: stdout chunk (state=2): >>>ac521673-7fcd-458c-a467-85be3447b4e6 <<< 9197 1726776637.58703: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776637.58712: stdout chunk (state=3): >>><<< 9197 1726776637.58722: stderr chunk (state=3): >>><<< 9197 1726776637.58737: _low_level_execute_command() done: rc=0, stdout=ac521673-7fcd-458c-a467-85be3447b4e6 , stderr=Shared connection to 10.31.12.75 closed. 9197 1726776637.58745: reboot: last boot time: ac521673-7fcd-458c-a467-85be3447b4e6 9197 1726776637.58753: reboot: last boot time check fail 'boot time has not changed', retrying in 1.7510 seconds... 9197 1726776639.33870: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9197 1726776639.33881: _low_level_execute_command(): starting 9197 1726776639.33887: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9197 1726776649.35403: stderr chunk (state=2): >>>ssh: connect to host 10.31.12.75 port 22: Connection timed out <<< 9197 1726776649.35418: stdout chunk (state=2): >>><<< 9197 1726776649.35432: stderr chunk (state=3): >>><<< 9197 1726776649.36064: reboot: last boot time check fail 'Failed to connect to the host via ssh: ssh: connect to host 10.31.12.75 port 22: Connection timed out', retrying in 2.0280 seconds... 
9197 1726776651.38887: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9197 1726776651.38895: _low_level_execute_command(): starting 9197 1726776651.38901: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9197 1726776658.47934: stderr chunk (state=2): >>>ssh: connect to host 10.31.12.75 port 22: Connection refused <<< 9197 1726776658.47946: stdout chunk (state=2): >>><<< 9197 1726776658.47958: stderr chunk (state=3): >>><<< 9197 1726776658.49512: reboot: last boot time check fail 'Failed to connect to the host via ssh: ssh: connect to host 10.31.12.75 port 22: Connection refused', retrying in 4.3540 seconds... 9197 1726776662.84934: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9197 1726776662.84945: _low_level_execute_command(): starting 9197 1726776662.84952: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9197 1726776663.22736: stdout chunk (state=2): >>>fabaa6a3-57ec-405c-a52e-56993c7a36c3 <<< 9197 1726776663.23200: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776663.23261: stderr chunk (state=3): >>><<< 9197 1726776663.23272: stdout chunk (state=3): >>><<< 9197 1726776663.23285: _low_level_execute_command() done: rc=0, stdout=fabaa6a3-57ec-405c-a52e-56993c7a36c3 , stderr=Shared connection to 10.31.12.75 closed. 
9197 1726776663.23289: reboot: last boot time: fabaa6a3-57ec-405c-a52e-56993c7a36c3 9197 1726776663.23292: reboot: last boot time check success 9197 1726776663.23300: reboot: attempting post-reboot test command 'tuned-adm active' 9197 1726776663.23302: _low_level_execute_command(): starting 9197 1726776663.23305: _low_level_execute_command(): executing: /bin/sh -c 'tuned-adm active && sleep 0' 9197 1726776663.36903: stdout chunk (state=2): >>>Current active profile: virtual-guest kernel_settings <<< 9197 1726776663.38394: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 9197 1726776663.38441: stderr chunk (state=3): >>><<< 9197 1726776663.38448: stdout chunk (state=3): >>><<< 9197 1726776663.38462: _low_level_execute_command() done: rc=0, stdout=Current active profile: virtual-guest kernel_settings , stderr=Shared connection to 10.31.12.75 closed. 9197 1726776663.38470: reboot: post-reboot test command success 9197 1726776663.38480: _low_level_execute_command(): starting 9197 1726776663.38485: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776636.3341432-9197-12738887004368/ > /dev/null 2>&1 && sleep 0' 9197 1726776663.41210: stderr chunk (state=2): >>><<< 9197 1726776663.41219: stdout chunk (state=2): >>><<< 9197 1726776663.41235: _low_level_execute_command() done: rc=0, stdout=, stderr= 9197 1726776663.41240: handler run complete 9197 1726776663.41252: attempt loop complete, returning result 9197 1726776663.41256: _execute() done 9197 1726776663.41259: dumping result to json 9197 1726776663.41262: done dumping result, returning 9197 1726776663.41269: done running TaskExecutor() for managed_node2/TASK: Reboot the machine - see if settings persist after reboot [120fa90a-8a95-cec2-986e-000000000015] 9197 1726776663.41275: sending task result for task 120fa90a-8a95-cec2-986e-000000000015 9197 1726776663.41298: done sending task result for task 120fa90a-8a95-cec2-986e-000000000015 9197 
1726776663.41301: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "elapsed": 25, "rebooted": true } 8218 1726776663.41395: no more pending results, returning what we have 8218 1726776663.41398: results queue empty 8218 1726776663.41398: checking for any_errors_fatal 8218 1726776663.41403: done checking for any_errors_fatal 8218 1726776663.41403: checking for max_fail_percentage 8218 1726776663.41404: done checking for max_fail_percentage 8218 1726776663.41405: checking to see if all hosts have failed and the running result is not ok 8218 1726776663.41406: done checking to see if all hosts have failed 8218 1726776663.41406: getting the remaining hosts for this loop 8218 1726776663.41407: done getting the remaining hosts for this loop 8218 1726776663.41410: getting the next task for host managed_node2 8218 1726776663.41413: done getting next task for host managed_node2 8218 1726776663.41415: ^ task is: TASK: Check sysctl after reboot 8218 1726776663.41416: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776663.41419: getting variables 8218 1726776663.41419: in VariableManager get_vars() 8218 1726776663.41449: Calling all_inventory to load vars for managed_node2 8218 1726776663.41452: Calling groups_inventory to load vars for managed_node2 8218 1726776663.41454: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776663.41462: Calling all_plugins_play to load vars for managed_node2 8218 1726776663.41464: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776663.41466: Calling groups_plugins_play to load vars for managed_node2 8218 1726776663.41608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776663.41759: done with get_vars() 8218 1726776663.41766: done getting variables 8218 1726776663.41806: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:99 Thursday 19 September 2024 16:11:03 -0400 (0:00:27.171) 0:00:49.248 **** 8218 1726776663.41822: entering _queue_task() for managed_node2/shell 8218 1726776663.41977: worker is 1 (out of 1 available) 8218 1726776663.41992: exiting _queue_task() for managed_node2/shell 8218 1726776663.42004: done queuing things up, now waiting for results queue to drain 8218 1726776663.42006: waiting for pending results... 
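Annotation: the reboot sequence above works by recording `/proc/sys/kernel/random/boot_id` before the reboot, then polling it until the value changes, treating SSH failures ("Connection timed out", "Connection refused") as "still rebooting" and retrying with a growing delay. A minimal sketch of that loop (hypothetical helper under stated assumptions; Ansible's real plugin differs in detail):

```python
import time

def wait_for_reboot(get_boot_id, boot_id_before, timeout=600, sleep=time.sleep):
    """Poll the boot id until it differs from the pre-reboot value.

    get_boot_id() models `cat /proc/sys/kernel/random/boot_id` over SSH and
    may raise OSError while sshd is down, as seen in the log's retries.
    """
    deadline = time.monotonic() + timeout
    delay = 1.0
    while time.monotonic() < deadline:
        try:
            current = get_boot_id()
            if current != boot_id_before:
                return current  # "last boot time check success"
        except OSError:
            pass  # host still rebooting; connection timed out / refused
        sleep(delay)
        delay = min(delay * 2, 12.0)  # back off between attempts
    raise TimeoutError("boot id did not change before the timeout")
```

Once the boot id changes (here, `ac521673-…` to `fabaa6a3-…`), the plugin runs the configured post-reboot `test_command` (`tuned-adm active`) before declaring success.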
10313 1726776663.42123: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10313 1726776663.42215: in run() - task 120fa90a-8a95-cec2-986e-000000000016 10313 1726776663.42232: variable 'ansible_search_path' from source: unknown 10313 1726776663.42261: calling self._execute() 10313 1726776663.42320: variable 'ansible_host' from source: host vars for 'managed_node2' 10313 1726776663.42330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10313 1726776663.42339: variable 'omit' from source: magic vars 10313 1726776663.42409: variable 'omit' from source: magic vars 10313 1726776663.42434: variable 'omit' from source: magic vars 10313 1726776663.42459: variable 'omit' from source: magic vars 10313 1726776663.42490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10313 1726776663.42516: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10313 1726776663.42536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10313 1726776663.42551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10313 1726776663.42563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10313 1726776663.42588: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10313 1726776663.42594: variable 'ansible_host' from source: host vars for 'managed_node2' 10313 1726776663.42599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10313 1726776663.42665: Set connection var ansible_connection to ssh 10313 1726776663.42674: Set connection var ansible_pipelining to False 10313 1726776663.42680: Set connection var ansible_timeout to 10 10313 1726776663.42687: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10313 1726776663.42693: Set connection var ansible_shell_type to sh 10313 1726776663.42698: Set connection var ansible_shell_executable to /bin/sh 10313 1726776663.42714: variable 'ansible_shell_executable' from source: unknown 10313 1726776663.42718: variable 'ansible_connection' from source: unknown 10313 1726776663.42721: variable 'ansible_module_compression' from source: unknown 10313 1726776663.42724: variable 'ansible_shell_type' from source: unknown 10313 1726776663.42728: variable 'ansible_shell_executable' from source: unknown 10313 1726776663.42732: variable 'ansible_host' from source: host vars for 'managed_node2' 10313 1726776663.42736: variable 'ansible_pipelining' from source: unknown 10313 1726776663.42740: variable 'ansible_timeout' from source: unknown 10313 1726776663.42744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10313 1726776663.42832: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10313 1726776663.42843: variable 'omit' from source: magic vars 10313 1726776663.42849: starting attempt loop 10313 1726776663.42853: running the handler 10313 1726776663.42861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10313 1726776663.42876: _low_level_execute_command(): starting 10313 1726776663.42884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10313 1726776663.45189: stdout chunk (state=2): >>>/root <<< 10313 1726776663.45304: stderr chunk (state=3): >>><<< 10313 
1726776663.45311: stdout chunk (state=3): >>><<< 10313 1726776663.45326: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10313 1726776663.45342: _low_level_execute_command(): starting 10313 1726776663.45348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779 `" && echo ansible-tmp-1726776663.4533727-10313-95960969137779="` echo /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779 `" ) && sleep 0' 10313 1726776663.47897: stdout chunk (state=2): >>>ansible-tmp-1726776663.4533727-10313-95960969137779=/root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779 <<< 10313 1726776663.48027: stderr chunk (state=3): >>><<< 10313 1726776663.48036: stdout chunk (state=3): >>><<< 10313 1726776663.48052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776663.4533727-10313-95960969137779=/root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779 , stderr= 10313 1726776663.48076: variable 'ansible_module_compression' from source: unknown 10313 1726776663.48119: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10313 1726776663.48154: variable 'ansible_facts' from source: unknown 10313 1726776663.48227: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/AnsiballZ_command.py 10313 1726776663.48327: Sending initial data 10313 1726776663.48335: Sent initial data (154 bytes) 10313 1726776663.51118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp4qnz5tqc /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/AnsiballZ_command.py <<< 10313 1726776663.52177: stderr chunk (state=3): >>><<< 10313 1726776663.52185: stdout chunk (state=3): >>><<< 10313 1726776663.52205: done transferring module to 
remote 10313 1726776663.52216: _low_level_execute_command(): starting 10313 1726776663.52221: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/ /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/AnsiballZ_command.py && sleep 0' 10313 1726776663.54595: stderr chunk (state=2): >>><<< 10313 1726776663.54605: stdout chunk (state=2): >>><<< 10313 1726776663.54620: _low_level_execute_command() done: rc=0, stdout=, stderr= 10313 1726776663.54625: _low_level_execute_command(): starting 10313 1726776663.54634: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/AnsiballZ_command.py && sleep 0' 10313 1726776663.74679: stdout chunk (state=2): >>> {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 16:11:03.442326", "end": "2024-09-19 16:11:03.455655", "delta": "0:00:00.013329", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10313 1726776663.75995: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10313 1726776663.76009: stdout chunk (state=3): >>><<< 10313 1726776663.76021: stderr chunk (state=3): >>><<< 10313 1726776663.76039: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 16:11:03.442326", "end": "2024-09-19 16:11:03.455655", "delta": "0:00:00.013329", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10313 1726776663.76081: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10313 1726776663.76091: _low_level_execute_command(): starting 10313 1726776663.76099: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776663.4533727-10313-95960969137779/ > /dev/null 2>&1 && sleep 0' 10313 1726776663.78507: stderr chunk (state=2): >>><<< 10313 1726776663.78515: stdout chunk (state=2): >>><<< 10313 1726776663.78532: _low_level_execute_command() done: 
rc=0, stdout=, stderr= 10313 1726776663.78540: handler run complete 10313 1726776663.78565: Evaluated conditional (False): False 10313 1726776663.78578: attempt loop complete, returning result 10313 1726776663.78582: _execute() done 10313 1726776663.78586: dumping result to json 10313 1726776663.78592: done dumping result, returning 10313 1726776663.78600: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [120fa90a-8a95-cec2-986e-000000000016] 10313 1726776663.78607: sending task result for task 120fa90a-8a95-cec2-986e-000000000016 10313 1726776663.78652: done sending task result for task 120fa90a-8a95-cec2-986e-000000000016 10313 1726776663.78657: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "delta": "0:00:00.013329", "end": "2024-09-19 16:11:03.455655", "rc": 0, "start": "2024-09-19 16:11:03.442326" } STDOUT: 400000 8218 1726776663.78814: no more pending results, returning what we have 8218 1726776663.78817: results queue empty 8218 1726776663.78818: checking for any_errors_fatal 8218 1726776663.78825: done checking for any_errors_fatal 8218 1726776663.78825: checking for max_fail_percentage 8218 1726776663.78827: done checking for max_fail_percentage 8218 1726776663.78828: checking to see if all hosts have failed and the running result is not ok 8218 1726776663.78830: done checking to see if all hosts have failed 8218 1726776663.78833: getting the remaining hosts for this loop 8218 1726776663.78834: done getting the remaining hosts for this loop 8218 1726776663.78837: getting the next task for host managed_node2 8218 1726776663.78842: done getting next task for host managed_node2 8218 1726776663.78844: ^ task is: TASK: Check sysfs after reboot 8218 1726776663.78846: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776663.78849: getting variables 8218 1726776663.78851: in VariableManager get_vars() 8218 1726776663.78884: Calling all_inventory to load vars for managed_node2 8218 1726776663.78886: Calling groups_inventory to load vars for managed_node2 8218 1726776663.78887: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776663.78895: Calling all_plugins_play to load vars for managed_node2 8218 1726776663.78897: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776663.78898: Calling groups_plugins_play to load vars for managed_node2 8218 1726776663.79009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776663.79122: done with get_vars() 8218 1726776663.79136: done getting variables 8218 1726776663.79185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysfs after reboot] ************************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:105 Thursday 19 September 2024 16:11:03 -0400 (0:00:00.373) 0:00:49.622 **** 8218 1726776663.79206: entering _queue_task() for managed_node2/command 8218 1726776663.79363: worker is 1 (out of 1 available) 8218 1726776663.79378: exiting _queue_task() for managed_node2/command 8218 1726776663.79391: done queuing things up, now waiting for results queue to drain 8218 1726776663.79392: waiting for pending results... 
10334 1726776663.79514: running TaskExecutor() for managed_node2/TASK: Check sysfs after reboot 10334 1726776663.79602: in run() - task 120fa90a-8a95-cec2-986e-000000000017 10334 1726776663.79617: variable 'ansible_search_path' from source: unknown 10334 1726776663.79648: calling self._execute() 10334 1726776663.79709: variable 'ansible_host' from source: host vars for 'managed_node2' 10334 1726776663.79717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10334 1726776663.79727: variable 'omit' from source: magic vars 10334 1726776663.79799: variable 'omit' from source: magic vars 10334 1726776663.79821: variable 'omit' from source: magic vars 10334 1726776663.79845: variable 'omit' from source: magic vars 10334 1726776663.79876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10334 1726776663.79900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10334 1726776663.79915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10334 1726776663.79926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10334 1726776663.79937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10334 1726776663.79965: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10334 1726776663.79972: variable 'ansible_host' from source: host vars for 'managed_node2' 10334 1726776663.79976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10334 1726776663.80045: Set connection var ansible_connection to ssh 10334 1726776663.80052: Set connection var ansible_pipelining to False 10334 1726776663.80057: Set connection var ansible_timeout to 10 10334 1726776663.80061: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10334 1726776663.80065: Set connection var ansible_shell_type to sh 10334 1726776663.80067: Set connection var ansible_shell_executable to /bin/sh 10334 1726776663.80082: variable 'ansible_shell_executable' from source: unknown 10334 1726776663.80085: variable 'ansible_connection' from source: unknown 10334 1726776663.80087: variable 'ansible_module_compression' from source: unknown 10334 1726776663.80089: variable 'ansible_shell_type' from source: unknown 10334 1726776663.80091: variable 'ansible_shell_executable' from source: unknown 10334 1726776663.80093: variable 'ansible_host' from source: host vars for 'managed_node2' 10334 1726776663.80097: variable 'ansible_pipelining' from source: unknown 10334 1726776663.80100: variable 'ansible_timeout' from source: unknown 10334 1726776663.80102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10334 1726776663.80191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10334 1726776663.80201: variable 'omit' from source: magic vars 10334 1726776663.80205: starting attempt loop 10334 1726776663.80207: running the handler 10334 1726776663.80218: _low_level_execute_command(): starting 10334 1726776663.80223: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10334 1726776663.82508: stdout chunk (state=2): >>>/root <<< 10334 1726776663.82624: stderr chunk (state=3): >>><<< 10334 1726776663.82632: stdout chunk (state=3): >>><<< 10334 1726776663.82649: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10334 1726776663.82662: _low_level_execute_command(): starting 10334 1726776663.82668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474 `" && echo ansible-tmp-1726776663.8265734-10334-135910429722474="` echo /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474 `" ) && sleep 0' 10334 1726776663.85198: stdout chunk (state=2): >>>ansible-tmp-1726776663.8265734-10334-135910429722474=/root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474 <<< 10334 1726776663.85321: stderr chunk (state=3): >>><<< 10334 1726776663.85327: stdout chunk (state=3): >>><<< 10334 1726776663.85343: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776663.8265734-10334-135910429722474=/root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474 , stderr= 10334 1726776663.85365: variable 'ansible_module_compression' from source: unknown 10334 1726776663.85407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10334 1726776663.85440: variable 'ansible_facts' from source: unknown 10334 1726776663.85514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/AnsiballZ_command.py 10334 1726776663.85606: Sending initial data 10334 1726776663.85613: Sent initial data (155 bytes) 10334 1726776663.88122: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpxbnb7mf3 /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/AnsiballZ_command.py <<< 10334 1726776663.89194: stderr chunk (state=3): >>><<< 10334 1726776663.89202: stdout chunk (state=3): >>><<< 10334 1726776663.89221: done transferring module to remote 10334 1726776663.89235: _low_level_execute_command(): starting 10334 1726776663.89241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/ /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/AnsiballZ_command.py && sleep 
0' 10334 1726776663.91623: stderr chunk (state=2): >>><<< 10334 1726776663.91634: stdout chunk (state=2): >>><<< 10334 1726776663.91648: _low_level_execute_command() done: rc=0, stdout=, stderr= 10334 1726776663.91652: _low_level_execute_command(): starting 10334 1726776663.91657: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/AnsiballZ_command.py && sleep 0' 10334 1726776664.07275: stdout chunk (state=2): >>> {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:11:03.778475", "end": "2024-09-19 16:11:03.781641", "delta": "0:00:00.003166", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10334 1726776664.08505: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10334 1726776664.08514: stdout chunk (state=3): >>><<< 10334 1726776664.08524: stderr chunk (state=3): >>><<< 10334 1726776664.08538: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:11:03.778475", "end": "2024-09-19 16:11:03.781641", "delta": "0:00:00.003166", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10334 1726776664.08664: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -x 65000 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/', '_ansible_remote_tmp': '~/.ansible/tmp'})
10334 1726776664.08678: _low_level_execute_command(): starting
10334 1726776664.08684: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776663.8265734-10334-135910429722474/ > /dev/null 2>&1 && sleep 0'
10334 1726776664.11277: stderr chunk (state=2): >>><<<
10334 1726776664.11285: stdout chunk (state=2): >>><<<
10334 1726776664.11301: _low_level_execute_command() done: rc=0, stdout=, stderr=
10334 1726776664.11308: handler run complete
10334 1726776664.11325: Evaluated conditional (False): False
10334 1726776664.11337: attempt loop complete, returning result
10334 1726776664.11341: _execute() done
10334 1726776664.11344: dumping result to json
10334 1726776664.11349: done dumping result, returning
10334 1726776664.11355: done running TaskExecutor() for managed_node2/TASK: Check sysfs after reboot [120fa90a-8a95-cec2-986e-000000000017]
10334 1726776664.11362: sending task result for task 120fa90a-8a95-cec2-986e-000000000017
10334 1726776664.11391: done sending task result for task 120fa90a-8a95-cec2-986e-000000000017
10334 1726776664.11395: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": [
        "grep",
        "-x",
        "65000",
        "/sys/class/net/lo/mtu"
    ],
    "delta": "0:00:00.003166",
    "end": "2024-09-19 16:11:03.781641",
    "rc": 0,
    "start": "2024-09-19 16:11:03.778475"
}

STDOUT:

65000

8218 1726776664.11538: no more pending results, returning what we have
8218 1726776664.11541: results queue empty
8218 1726776664.11542: checking for any_errors_fatal
8218 1726776664.11548: done checking for any_errors_fatal
8218 1726776664.11549: checking for max_fail_percentage
8218 1726776664.11551: done checking for max_fail_percentage
8218 1726776664.11551: checking to see if all hosts have failed and the running result is not ok
8218 1726776664.11552: done checking to see if all hosts have failed
8218 1726776664.11553: getting the remaining hosts for this loop
8218 1726776664.11554: done getting the remaining hosts for this loop
8218 1726776664.11557: getting the next task for host managed_node2
8218 1726776664.11562: done getting next task for host managed_node2
8218 1726776664.11564: ^ task is: TASK: Check sysctl after reboot
8218 1726776664.11566: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
8218 1726776664.11570: getting variables
8218 1726776664.11571: in VariableManager get_vars()
8218 1726776664.11604: Calling all_inventory to load vars for managed_node2
8218 1726776664.11606: Calling groups_inventory to load vars for managed_node2
8218 1726776664.11608: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776664.11617: Calling all_plugins_play to load vars for managed_node2
8218 1726776664.11619: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776664.11622: Calling groups_plugins_play to load vars for managed_node2
8218 1726776664.11846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776664.12027: done with get_vars()
8218 1726776664.12042: done getting variables
8218 1726776664.12101: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Check sysctl after reboot] ***********************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:109
Thursday 19 September 2024  16:11:04 -0400 (0:00:00.329)       0:00:49.951 ****
8218 1726776664.12128: entering _queue_task() for managed_node2/shell
8218 1726776664.12328: worker is 1 (out of 1 available)
8218 1726776664.12344: exiting _queue_task() for managed_node2/shell
8218 1726776664.12355: done queuing things up, now waiting for results queue to drain
8218 1726776664.12356: waiting for pending results...
10349 1726776664.12569: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10349 1726776664.12680: in run() - task 120fa90a-8a95-cec2-986e-000000000018 10349 1726776664.12700: variable 'ansible_search_path' from source: unknown 10349 1726776664.12734: calling self._execute() 10349 1726776664.12802: variable 'ansible_host' from source: host vars for 'managed_node2' 10349 1726776664.12816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10349 1726776664.12829: variable 'omit' from source: magic vars 10349 1726776664.12903: variable 'omit' from source: magic vars 10349 1726776664.12926: variable 'omit' from source: magic vars 10349 1726776664.12950: variable 'omit' from source: magic vars 10349 1726776664.12981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10349 1726776664.13006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10349 1726776664.13023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10349 1726776664.13040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10349 1726776664.13054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10349 1726776664.13082: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10349 1726776664.13088: variable 'ansible_host' from source: host vars for 'managed_node2' 10349 1726776664.13092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10349 1726776664.13160: Set connection var ansible_connection to ssh 10349 1726776664.13167: Set connection var ansible_pipelining to False 10349 1726776664.13174: Set connection var ansible_timeout to 10 10349 1726776664.13181: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10349 1726776664.13186: Set connection var ansible_shell_type to sh 10349 1726776664.13192: Set connection var ansible_shell_executable to /bin/sh 10349 1726776664.13209: variable 'ansible_shell_executable' from source: unknown 10349 1726776664.13213: variable 'ansible_connection' from source: unknown 10349 1726776664.13216: variable 'ansible_module_compression' from source: unknown 10349 1726776664.13219: variable 'ansible_shell_type' from source: unknown 10349 1726776664.13222: variable 'ansible_shell_executable' from source: unknown 10349 1726776664.13226: variable 'ansible_host' from source: host vars for 'managed_node2' 10349 1726776664.13231: variable 'ansible_pipelining' from source: unknown 10349 1726776664.13235: variable 'ansible_timeout' from source: unknown 10349 1726776664.13239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10349 1726776664.13331: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10349 1726776664.13343: variable 'omit' from source: magic vars 10349 1726776664.13349: starting attempt loop 10349 1726776664.13353: running the handler 10349 1726776664.13361: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10349 1726776664.13374: _low_level_execute_command(): starting 10349 1726776664.13380: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10349 1726776664.16059: stdout chunk (state=2): >>>/root <<< 10349 1726776664.16195: stderr chunk (state=3): >>><<< 10349 
1726776664.16202: stdout chunk (state=3): >>><<< 10349 1726776664.16220: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10349 1726776664.16239: _low_level_execute_command(): starting 10349 1726776664.16247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930 `" && echo ansible-tmp-1726776664.1623049-10349-281330909370930="` echo /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930 `" ) && sleep 0' 10349 1726776664.18919: stdout chunk (state=2): >>>ansible-tmp-1726776664.1623049-10349-281330909370930=/root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930 <<< 10349 1726776664.19050: stderr chunk (state=3): >>><<< 10349 1726776664.19057: stdout chunk (state=3): >>><<< 10349 1726776664.19073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776664.1623049-10349-281330909370930=/root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930 , stderr= 10349 1726776664.19098: variable 'ansible_module_compression' from source: unknown 10349 1726776664.19144: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10349 1726776664.19176: variable 'ansible_facts' from source: unknown 10349 1726776664.19250: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/AnsiballZ_command.py 10349 1726776664.19349: Sending initial data 10349 1726776664.19356: Sent initial data (155 bytes) 10349 1726776664.21900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpgx2mkyyp /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/AnsiballZ_command.py <<< 10349 1726776664.22968: stderr chunk (state=3): >>><<< 10349 1726776664.22976: stdout chunk (state=3): >>><<< 10349 1726776664.22996: done transferring 
module to remote 10349 1726776664.23007: _low_level_execute_command(): starting 10349 1726776664.23012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/ /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/AnsiballZ_command.py && sleep 0' 10349 1726776664.25446: stderr chunk (state=2): >>><<< 10349 1726776664.25455: stdout chunk (state=2): >>><<< 10349 1726776664.25468: _low_level_execute_command() done: rc=0, stdout=, stderr= 10349 1726776664.25472: _low_level_execute_command(): starting 10349 1726776664.25477: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/AnsiballZ_command.py && sleep 0' 10349 1726776664.41837: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 16:11:04.138574", "end": "2024-09-19 16:11:04.145440", "delta": "0:00:00.006866", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10349 1726776664.42822: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10349 1726776664.42838: stdout chunk (state=3): >>><<< 10349 1726776664.42850: stderr chunk (state=3): >>><<< 10349 1726776664.42863: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 16:11:04.138574", "end": "2024-09-19 16:11:04.145440", "delta": "0:00:00.006866", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10349 1726776664.42911: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10349 1726776664.42921: _low_level_execute_command(): starting 10349 1726776664.42929: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776664.1623049-10349-281330909370930/ > /dev/null 2>&1 && sleep 0' 10349 1726776664.45604: stderr chunk (state=2): >>><<< 10349 1726776664.45616: stdout chunk (state=2): >>><<< 10349 1726776664.45636: 
_low_level_execute_command() done: rc=0, stdout=, stderr=
10349 1726776664.45645: handler run complete
10349 1726776664.45668: Evaluated conditional (False): False
10349 1726776664.45679: attempt loop complete, returning result
10349 1726776664.45683: _execute() done
10349 1726776664.45686: dumping result to json
10349 1726776664.45691: done dumping result, returning
10349 1726776664.45698: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [120fa90a-8a95-cec2-986e-000000000018]
10349 1726776664.45705: sending task result for task 120fa90a-8a95-cec2-986e-000000000018
10349 1726776664.45748: done sending task result for task 120fa90a-8a95-cec2-986e-000000000018
10349 1726776664.45753: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968",
    "delta": "0:00:00.006866",
    "end": "2024-09-19 16:11:04.145440",
    "rc": 0,
    "start": "2024-09-19 16:11:04.138574"
}
8218 1726776664.46148: no more pending results, returning what we have
8218 1726776664.46152: results queue empty
8218 1726776664.46152: checking for any_errors_fatal
8218 1726776664.46159: done checking for any_errors_fatal
8218 1726776664.46159: checking for max_fail_percentage
8218 1726776664.46161: done checking for max_fail_percentage
8218 1726776664.46162: checking to see if all hosts have failed and the running result is not ok
8218 1726776664.46163: done checking to see if all hosts have failed
8218 1726776664.46163: getting the remaining hosts for this loop
8218 1726776664.46164: done getting the remaining hosts for this loop
8218 1726776664.46168: getting the next task for host managed_node2
8218 1726776664.46173: done getting next task for host managed_node2
8218 1726776664.46176: ^ task is: TASK: Check with tuned verify
8218 1726776664.46177: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True,
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776664.46181: getting variables
8218 1726776664.46182: in VariableManager get_vars()
8218 1726776664.46219: Calling all_inventory to load vars for managed_node2
8218 1726776664.46222: Calling groups_inventory to load vars for managed_node2
8218 1726776664.46225: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776664.46239: Calling all_plugins_play to load vars for managed_node2
8218 1726776664.46243: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776664.46246: Calling groups_plugins_play to load vars for managed_node2
8218 1726776664.46410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776664.46599: done with get_vars()
8218 1726776664.46611: done getting variables
8218 1726776664.46662: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Check with tuned verify] *************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:115
Thursday 19 September 2024  16:11:04 -0400 (0:00:00.345)       0:00:50.297 ****
8218 1726776664.46683: entering _queue_task() for managed_node2/command
8218 1726776664.46861: worker is 1 (out of 1 available)
8218 1726776664.46874: exiting _queue_task() for managed_node2/command
8218 1726776664.46885: done queuing things up, now waiting for results queue to drain
8218 1726776664.46886: waiting for pending results...
10363 1726776664.47099: running TaskExecutor() for managed_node2/TASK: Check with tuned verify 10363 1726776664.47198: in run() - task 120fa90a-8a95-cec2-986e-000000000019 10363 1726776664.47214: variable 'ansible_search_path' from source: unknown 10363 1726776664.47251: calling self._execute() 10363 1726776664.47326: variable 'ansible_host' from source: host vars for 'managed_node2' 10363 1726776664.47341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10363 1726776664.47350: variable 'omit' from source: magic vars 10363 1726776664.47450: variable 'omit' from source: magic vars 10363 1726776664.47480: variable 'omit' from source: magic vars 10363 1726776664.47506: variable 'omit' from source: magic vars 10363 1726776664.47553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10363 1726776664.47586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10363 1726776664.47610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10363 1726776664.47627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10363 1726776664.47646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10363 1726776664.47675: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10363 1726776664.47682: variable 'ansible_host' from source: host vars for 'managed_node2' 10363 1726776664.47686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10363 1726776664.47792: Set connection var ansible_connection to ssh 10363 1726776664.47801: Set connection var ansible_pipelining to False 10363 1726776664.47808: Set connection var ansible_timeout to 10 10363 1726776664.47816: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10363 1726776664.47821: Set connection var ansible_shell_type to sh 10363 1726776664.47826: Set connection var ansible_shell_executable to /bin/sh 10363 1726776664.47851: variable 'ansible_shell_executable' from source: unknown 10363 1726776664.47856: variable 'ansible_connection' from source: unknown 10363 1726776664.47860: variable 'ansible_module_compression' from source: unknown 10363 1726776664.47863: variable 'ansible_shell_type' from source: unknown 10363 1726776664.47866: variable 'ansible_shell_executable' from source: unknown 10363 1726776664.47869: variable 'ansible_host' from source: host vars for 'managed_node2' 10363 1726776664.47873: variable 'ansible_pipelining' from source: unknown 10363 1726776664.47876: variable 'ansible_timeout' from source: unknown 10363 1726776664.47879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10363 1726776664.48088: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10363 1726776664.48103: variable 'omit' from source: magic vars 10363 1726776664.48109: starting attempt loop 10363 1726776664.48113: running the handler 10363 1726776664.48127: _low_level_execute_command(): starting 10363 1726776664.48140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10363 1726776664.51658: stdout chunk (state=2): >>>/root <<< 10363 1726776664.52120: stderr chunk (state=3): >>><<< 10363 1726776664.52131: stdout chunk (state=3): >>><<< 10363 1726776664.52158: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10363 1726776664.52173: _low_level_execute_command(): starting 10363 1726776664.52180: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678 `" && echo ansible-tmp-1726776664.5216718-10363-166285826428678="` echo /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678 `" ) && sleep 0' 10363 1726776664.55286: stdout chunk (state=2): >>>ansible-tmp-1726776664.5216718-10363-166285826428678=/root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678 <<< 10363 1726776664.55534: stderr chunk (state=3): >>><<< 10363 1726776664.55544: stdout chunk (state=3): >>><<< 10363 1726776664.55562: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776664.5216718-10363-166285826428678=/root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678 , stderr= 10363 1726776664.55592: variable 'ansible_module_compression' from source: unknown 10363 1726776664.55647: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10363 1726776664.55681: variable 'ansible_facts' from source: unknown 10363 1726776664.55784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/AnsiballZ_command.py 10363 1726776664.56261: Sending initial data 10363 1726776664.56269: Sent initial data (155 bytes) 10363 1726776664.58878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpbs0gzq78 /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/AnsiballZ_command.py <<< 10363 1726776664.60738: stderr chunk (state=3): >>><<< 10363 1726776664.60750: stdout chunk (state=3): >>><<< 10363 1726776664.60778: done transferring module to remote 10363 1726776664.60793: _low_level_execute_command(): starting 10363 1726776664.60799: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/ /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/AnsiballZ_command.py && sleep 
0' 10363 1726776664.63903: stderr chunk (state=2): >>><<< 10363 1726776664.63913: stdout chunk (state=2): >>><<< 10363 1726776664.63932: _low_level_execute_command() done: rc=0, stdout=, stderr= 10363 1726776664.63940: _low_level_execute_command(): starting 10363 1726776664.63946: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/AnsiballZ_command.py && sleep 0' 10363 1726776664.90817: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:04.557974", "end": "2024-09-19 16:11:04.677082", "delta": "0:00:00.119108", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10363 1726776664.91826: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10363 1726776664.91874: stderr chunk (state=3): >>><<< 10363 1726776664.91886: stdout chunk (state=3): >>><<< 10363 1726776664.91903: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:04.557974", "end": "2024-09-19 16:11:04.677082", "delta": "0:00:00.119108", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10363 1726776664.91948: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10363 1726776664.91959: _low_level_execute_command(): starting 10363 1726776664.91965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776664.5216718-10363-166285826428678/ > /dev/null 2>&1 && sleep 0' 10363 1726776664.95034: stderr chunk (state=2): >>><<< 10363 1726776664.95044: stdout chunk (state=2): >>><<< 10363 1726776664.95059: _low_level_execute_command() 
done: rc=0, stdout=, stderr=
10363 1726776664.95067: handler run complete
10363 1726776664.95089: Evaluated conditional (False): False
10363 1726776664.95102: attempt loop complete, returning result
10363 1726776664.95106: _execute() done
10363 1726776664.95109: dumping result to json
10363 1726776664.95114: done dumping result, returning
10363 1726776664.95121: done running TaskExecutor() for managed_node2/TASK: Check with tuned verify [120fa90a-8a95-cec2-986e-000000000019]
10363 1726776664.95127: sending task result for task 120fa90a-8a95-cec2-986e-000000000019
10363 1726776664.95167: done sending task result for task 120fa90a-8a95-cec2-986e-000000000019
10363 1726776664.95171: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": [
        "tuned-adm",
        "verify",
        "-i"
    ],
    "delta": "0:00:00.119108",
    "end": "2024-09-19 16:11:04.677082",
    "rc": 0,
    "start": "2024-09-19 16:11:04.557974"
}

STDOUT:

Verification succeeded, current system settings match the preset profile.
See TuneD log file ('/var/log/tuned/tuned.log') for details.
8218 1726776664.95548: no more pending results, returning what we have
8218 1726776664.95551: results queue empty
8218 1726776664.95552: checking for any_errors_fatal
8218 1726776664.95559: done checking for any_errors_fatal
8218 1726776664.95560: checking for max_fail_percentage
8218 1726776664.95561: done checking for max_fail_percentage
8218 1726776664.95562: checking to see if all hosts have failed and the running result is not ok
8218 1726776664.95563: done checking to see if all hosts have failed
8218 1726776664.95564: getting the remaining hosts for this loop
8218 1726776664.95565: done getting the remaining hosts for this loop
8218 1726776664.95568: getting the next task for host managed_node2
8218 1726776664.95575: done getting next task for host managed_node2
8218 1726776664.95577: ^ task is: TASK: Apply role again and remove settings
8218 1726776664.95579: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
8218 1726776664.95582: getting variables
8218 1726776664.95584: in VariableManager get_vars()
8218 1726776664.95618: Calling all_inventory to load vars for managed_node2
8218 1726776664.95621: Calling groups_inventory to load vars for managed_node2
8218 1726776664.95623: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776664.95634: Calling all_plugins_play to load vars for managed_node2
8218 1726776664.95637: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776664.95640: Calling groups_plugins_play to load vars for managed_node2
8218 1726776664.95868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776664.96056: done with get_vars()
8218 1726776664.96066: done getting variables

TASK [Apply role again and remove settings] ************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:119
Thursday 19 September 2024  16:11:04 -0400 (0:00:00.494)       0:00:50.792 ****
8218 1726776664.96161: entering _queue_task() for managed_node2/include_role
8218 1726776664.96349: worker is 1 (out of 1 available)
8218 1726776664.96365: exiting _queue_task() for managed_node2/include_role
8218 1726776664.96375: done queuing things up, now waiting for results queue to drain
8218 1726776664.96377: waiting for pending results...
10386 1726776664.96818: running TaskExecutor() for managed_node2/TASK: Apply role again and remove settings 10386 1726776664.96920: in run() - task 120fa90a-8a95-cec2-986e-00000000001a 10386 1726776664.96939: variable 'ansible_search_path' from source: unknown 10386 1726776664.96969: calling self._execute() 10386 1726776664.97037: variable 'ansible_host' from source: host vars for 'managed_node2' 10386 1726776664.97047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10386 1726776664.97054: variable 'omit' from source: magic vars 10386 1726776664.97151: _execute() done 10386 1726776664.97157: dumping result to json 10386 1726776664.97162: done dumping result, returning 10386 1726776664.97167: done running TaskExecutor() for managed_node2/TASK: Apply role again and remove settings [120fa90a-8a95-cec2-986e-00000000001a] 10386 1726776664.97174: sending task result for task 120fa90a-8a95-cec2-986e-00000000001a 10386 1726776664.97211: done sending task result for task 120fa90a-8a95-cec2-986e-00000000001a 10386 1726776664.97215: WORKER PROCESS EXITING 8218 1726776664.97512: no more pending results, returning what we have 8218 1726776664.97517: in VariableManager get_vars() 8218 1726776664.97556: Calling all_inventory to load vars for managed_node2 8218 1726776664.97559: Calling groups_inventory to load vars for managed_node2 8218 1726776664.97561: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776664.97570: Calling all_plugins_play to load vars for managed_node2 8218 1726776664.97573: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776664.97576: Calling groups_plugins_play to load vars for managed_node2 8218 1726776664.97746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776664.97924: done with get_vars() 8218 1726776664.97933: variable 'ansible_search_path' from source: unknown 8218 1726776664.99555: variable 'omit' from 
source: magic vars 8218 1726776664.99575: variable 'omit' from source: magic vars 8218 1726776664.99588: variable 'omit' from source: magic vars 8218 1726776664.99592: we have included files to process 8218 1726776664.99592: generating all_blocks data 8218 1726776664.99594: done generating all_blocks data 8218 1726776664.99596: processing included file: fedora.linux_system_roles.kernel_settings 8218 1726776664.99618: in VariableManager get_vars() 8218 1726776664.99638: done with get_vars() 8218 1726776664.99668: in VariableManager get_vars() 8218 1726776664.99686: done with get_vars() 8218 1726776664.99727: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8218 1726776664.99792: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8218 1726776664.99837: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8218 1726776664.99914: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8218 1726776665.00426: in VariableManager get_vars() 8218 1726776665.00449: done with get_vars() 8218 1726776665.01684: in VariableManager get_vars() 8218 1726776665.01706: done with get_vars() 8218 1726776665.01866: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8218 1726776665.02666: iterating over new_blocks loaded from include file 8218 1726776665.02668: in VariableManager get_vars() 8218 1726776665.02683: done with get_vars() 8218 1726776665.02684: filtering new block on tags 8218 1726776665.02714: done filtering new block on tags 8218 1726776665.02716: in VariableManager get_vars() 8218 1726776665.02727: done with get_vars() 8218 1726776665.02728: filtering new block on tags 8218 1726776665.02764: done filtering new block on tags 
8218 1726776665.02767: in VariableManager get_vars() 8218 1726776665.02779: done with get_vars() 8218 1726776665.02780: filtering new block on tags 8218 1726776665.02907: done filtering new block on tags 8218 1726776665.02910: in VariableManager get_vars() 8218 1726776665.02921: done with get_vars() 8218 1726776665.02923: filtering new block on tags 8218 1726776665.02944: done filtering new block on tags 8218 1726776665.02945: done iterating over new_blocks loaded from include file 8218 1726776665.02946: extending task lists for all hosts with included blocks 8218 1726776665.04426: done extending task lists 8218 1726776665.04427: done processing included files 8218 1726776665.04430: results queue empty 8218 1726776665.04430: checking for any_errors_fatal 8218 1726776665.04434: done checking for any_errors_fatal 8218 1726776665.04435: checking for max_fail_percentage 8218 1726776665.04435: done checking for max_fail_percentage 8218 1726776665.04436: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.04437: done checking to see if all hosts have failed 8218 1726776665.04437: getting the remaining hosts for this loop 8218 1726776665.04438: done getting the remaining hosts for this loop 8218 1726776665.04440: getting the next task for host managed_node2 8218 1726776665.04444: done getting next task for host managed_node2 8218 1726776665.04446: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8218 1726776665.04448: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776665.04458: getting variables 8218 1726776665.04459: in VariableManager get_vars() 8218 1726776665.04472: Calling all_inventory to load vars for managed_node2 8218 1726776665.04474: Calling groups_inventory to load vars for managed_node2 8218 1726776665.04476: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.04481: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.04483: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.04485: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.04618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.04819: done with get_vars() 8218 1726776665.04830: done getting variables 8218 1726776665.04865: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2
Thursday 19 September 2024 16:11:05 -0400 (0:00:00.087) 0:00:50.879 ****
8218 1726776665.04893: entering _queue_task() for managed_node2/fail 8218 1726776665.05100: worker is 1 (out of 1 available) 8218 1726776665.05114: exiting _queue_task() for managed_node2/fail 8218 1726776665.05125: done queuing things up, now waiting for results queue to drain 8218 1726776665.05126: waiting for pending results...
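The `_queue_task()` / "worker is 1 (out of 1 available)" / "waiting for pending results" entries above trace the strategy's task handoff. A minimal conceptual sketch of that handoff, in plain Python (this is an illustration of the queue pattern, not Ansible's actual `WorkerProcess` code; the task name and `run_task` callable are stand-ins):

```python
# Conceptual sketch of the _queue_task()/worker handoff seen in the log:
# the strategy puts a task on a queue, a worker executes it and pushes the
# outcome onto a results queue, and the main loop drains pending results.
import queue

task_queue, results_queue = queue.Queue(), queue.Queue()

def worker(run_task):
    """One worker: take a queued task, execute it, send the result back."""
    task = task_queue.get()
    results_queue.put({"task": task, "result": run_task(task)})

task_queue.put("Check sysctl settings for boolean values")
worker(lambda t: "skipped")   # inline stand-in for a forked worker process
result = results_queue.get()  # "no more pending results, returning what we have"
print(result)
```

In real Ansible the worker is a separate process and the result travels back through a multiprocessing queue, which is why the log shows distinct PIDs (8218 for the main process, 10386/10388/... for workers).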
10388 1726776665.05456: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10388 1726776665.05592: in run() - task 120fa90a-8a95-cec2-986e-0000000002ff 10388 1726776665.05609: variable 'ansible_search_path' from source: unknown 10388 1726776665.05614: variable 'ansible_search_path' from source: unknown 10388 1726776665.05649: calling self._execute() 10388 1726776665.05731: variable 'ansible_host' from source: host vars for 'managed_node2' 10388 1726776665.05742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10388 1726776665.05752: variable 'omit' from source: magic vars 10388 1726776665.06206: variable 'kernel_settings_sysctl' from source: include params 10388 1726776665.06225: variable '__kernel_settings_state_empty' from source: role '' all vars 10388 1726776665.06238: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 10388 1726776665.06590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10388 1726776665.08903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10388 1726776665.08966: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10388 1726776665.09002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10388 1726776665.09038: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10388 1726776665.09064: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10388 1726776665.09136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
10388 1726776665.09164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10388 1726776665.09189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10388 1726776665.09224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10388 1726776665.09239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10388 1726776665.09283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10388 1726776665.09302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10388 1726776665.09321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10388 1726776665.09356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10388 1726776665.09368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 10388 1726776665.09405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10388 1726776665.09424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10388 1726776665.09448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10388 1726776665.09485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10388 1726776665.09498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10388 1726776665.09849: variable 'kernel_settings_sysctl' from source: include params 10388 1726776665.09922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10388 1726776665.10095: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10388 1726776665.10133: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10388 1726776665.10162: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10388 1726776665.10191: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10388 1726776665.10231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10388 1726776665.10253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10388 1726776665.10275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10388 1726776665.10296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10388 1726776665.10320: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 10388 1726776665.10325: when evaluation is False, skipping this task 10388 1726776665.10330: _execute() done 10388 1726776665.10333: dumping result to json 10388 1726776665.10337: done dumping result, returning 10388 1726776665.10343: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-cec2-986e-0000000002ff] 10388 1726776665.10348: sending task result for task 120fa90a-8a95-cec2-986e-0000000002ff 10388 1726776665.10376: done sending task result for task 120fa90a-8a95-cec2-986e-0000000002ff 10388 1726776665.10380: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)",
    "skip_reason": "Conditional result was False"
}
8218 1726776665.10796: no more pending results, returning what we have 8218 1726776665.10799: results queue empty 8218 1726776665.10800: checking for any_errors_fatal 8218 1726776665.10802: done checking for any_errors_fatal 8218 1726776665.10802: checking for max_fail_percentage 8218 1726776665.10804: done checking for max_fail_percentage 8218 1726776665.10805: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.10806: done checking to see if all hosts have failed 8218 1726776665.10806: getting the remaining hosts for this loop 8218 1726776665.10807: done getting the remaining hosts for this loop 8218 1726776665.10811: getting the next task for host managed_node2 8218 1726776665.10818: done getting next task for host managed_node2 8218 1726776665.10823: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8218 1726776665.10826: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 8218 1726776665.10844: getting variables 8218 1726776665.10846: in VariableManager get_vars() 8218 1726776665.10883: Calling all_inventory to load vars for managed_node2 8218 1726776665.10886: Calling groups_inventory to load vars for managed_node2 8218 1726776665.10888: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.10897: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.10900: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.10903: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.11081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.11289: done with get_vars() 8218 1726776665.11302: done getting variables
TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9
Thursday 19 September 2024 16:11:05 -0400 (0:00:00.065) 0:00:50.944 ****
8218 1726776665.11399: entering _queue_task() for managed_node2/include_tasks 8218 1726776665.11592: worker is 1 (out of 1 available) 8218 1726776665.11606: exiting _queue_task() for managed_node2/include_tasks 8218 1726776665.11617: done queuing things up, now waiting for results queue to drain 8218 1726776665.11619: waiting for pending results...
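The `false_condition` recorded for the skipped "Check sysctl settings for boolean values" task chains Jinja2's `selectattr` filter with the `sameas` test. `sameas` is an identity comparison (Python's `is`), so only literal booleans match, not `0`/`1` or the strings `"true"`/`"false"`. A plain-Python emulation of what that expression computes (the sample settings list is hypothetical, not from the role):

```python
# Emulates the Jinja2 condition:
#   (kernel_settings_sysctl | selectattr("value", "defined")
#      | selectattr("value", "sameas", true)  | list | length > 0)
#   or (... | selectattr("value", "sameas", false) | list | length > 0)

def has_boolean_values(settings):
    """True if any entry's 'value' is literally True or False."""
    with_value = [s for s in settings if "value" in s]       # selectattr("value", "defined")
    trues = [s for s in with_value if s["value"] is True]    # sameas true  -> identity check
    falses = [s for s in with_value if s["value"] is False]  # sameas false -> identity check
    return len(trues) > 0 or len(falses) > 0

settings = [
    {"name": "fs.file-max", "value": 400000},
    {"name": "kernel.threads-max", "value": 29968},
]
print(has_boolean_values(settings))                         # integers only -> False
print(has_boolean_values([{"name": "x", "value": True}]))   # real boolean  -> True
```

Because the check uses identity, a value of `1` does not trip it; that matches the role's intent of rejecting YAML booleans (which sysctl cannot consume) while allowing numeric settings through.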
10391 1726776665.11836: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10391 1726776665.11968: in run() - task 120fa90a-8a95-cec2-986e-000000000300 10391 1726776665.11986: variable 'ansible_search_path' from source: unknown 10391 1726776665.11991: variable 'ansible_search_path' from source: unknown 10391 1726776665.12023: calling self._execute() 10391 1726776665.12104: variable 'ansible_host' from source: host vars for 'managed_node2' 10391 1726776665.12114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10391 1726776665.12123: variable 'omit' from source: magic vars 10391 1726776665.12221: _execute() done 10391 1726776665.12228: dumping result to json 10391 1726776665.12234: done dumping result, returning 10391 1726776665.12240: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-cec2-986e-000000000300] 10391 1726776665.12248: sending task result for task 120fa90a-8a95-cec2-986e-000000000300 10391 1726776665.12277: done sending task result for task 120fa90a-8a95-cec2-986e-000000000300 10391 1726776665.12281: WORKER PROCESS EXITING 8218 1726776665.12560: no more pending results, returning what we have 8218 1726776665.12564: in VariableManager get_vars() 8218 1726776665.12597: Calling all_inventory to load vars for managed_node2 8218 1726776665.12600: Calling groups_inventory to load vars for managed_node2 8218 1726776665.12602: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.12609: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.12612: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.12614: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.12811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 
1726776665.12986: done with get_vars() 8218 1726776665.12993: variable 'ansible_search_path' from source: unknown 8218 1726776665.12994: variable 'ansible_search_path' from source: unknown 8218 1726776665.13024: we have included files to process 8218 1726776665.13025: generating all_blocks data 8218 1726776665.13026: done generating all_blocks data 8218 1726776665.13034: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776665.13035: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776665.13037: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8218 1726776665.13664: done processing included file 8218 1726776665.13667: iterating over new_blocks loaded from include file 8218 1726776665.13668: in VariableManager get_vars() 8218 1726776665.13690: done with get_vars() 8218 1726776665.13691: filtering new block on tags 8218 1726776665.13717: done filtering new block on tags 8218 1726776665.13719: in VariableManager get_vars() 8218 1726776665.13743: done with get_vars() 8218 1726776665.13745: filtering new block on tags 8218 1726776665.13781: done filtering new block on tags 8218 1726776665.13784: in VariableManager get_vars() 8218 1726776665.13807: done with get_vars() 8218 1726776665.13808: filtering new block on tags 8218 1726776665.13847: done filtering new block on tags 8218 1726776665.13849: in VariableManager get_vars() 8218 1726776665.13871: done with get_vars() 8218 1726776665.13873: filtering new block on tags 8218 1726776665.13896: done filtering new block on tags 8218 1726776665.13898: done iterating over new_blocks loaded from include file 8218 1726776665.13898: 
extending task lists for all hosts with included blocks 8218 1726776665.14052: done extending task lists 8218 1726776665.14053: done processing included files 8218 1726776665.14054: results queue empty 8218 1726776665.14054: checking for any_errors_fatal 8218 1726776665.14057: done checking for any_errors_fatal 8218 1726776665.14058: checking for max_fail_percentage 8218 1726776665.14059: done checking for max_fail_percentage 8218 1726776665.14059: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.14060: done checking to see if all hosts have failed 8218 1726776665.14060: getting the remaining hosts for this loop 8218 1726776665.14061: done getting the remaining hosts for this loop 8218 1726776665.14064: getting the next task for host managed_node2 8218 1726776665.14068: done getting next task for host managed_node2 8218 1726776665.14070: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8218 1726776665.14073: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776665.14082: getting variables 8218 1726776665.14083: in VariableManager get_vars() 8218 1726776665.14097: Calling all_inventory to load vars for managed_node2 8218 1726776665.14099: Calling groups_inventory to load vars for managed_node2 8218 1726776665.14101: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.14106: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.14108: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.14110: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.14261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.14440: done with get_vars() 8218 1726776665.14450: done getting variables
TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2
Thursday 19 September 2024 16:11:05 -0400 (0:00:00.031) 0:00:50.975 ****
8218 1726776665.14515: entering _queue_task() for managed_node2/setup 8218 1726776665.14695: worker is 1 (out of 1 available) 8218 1726776665.14706: exiting _queue_task() for managed_node2/setup 8218 1726776665.14716: done queuing things up, now waiting for results queue to drain 8218 1726776665.14718: waiting for pending results...
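The "filtering new block on tags" entries above run once per block loaded from the included `set_vars.yml`. A conceptual sketch of that selection step (this is an illustration of the tag-matching idea, not Ansible's actual implementation; block names and tags are invented):

```python
# Conceptual tag filtering: a block survives when the run's --tags selection
# intersects its tags, when no selection is given, or when it is tagged
# "always" (which Ansible runs regardless of --tags).

def filter_blocks_on_tags(blocks, only_tags):
    """Keep blocks whose tags match the requested tags."""
    if not only_tags or "all" in only_tags:
        return list(blocks)
    return [b for b in blocks
            if "always" in b.get("tags", [])
            or set(b.get("tags", [])) & set(only_tags)]

blocks = [
    {"name": "setup", "tags": ["setup", "always"]},
    {"name": "apply", "tags": ["apply"]},
    {"name": "verify", "tags": ["tests"]},
]
kept = filter_blocks_on_tags(blocks, ["apply"])
print([b["name"] for b in kept])
```

In this run no `--tags` were passed, so every filtered block is kept, which is why each "filtering new block on tags" line is immediately followed by "done filtering" with nothing dropped.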
10392 1726776665.14920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10392 1726776665.15068: in run() - task 120fa90a-8a95-cec2-986e-000000000413 10392 1726776665.15084: variable 'ansible_search_path' from source: unknown 10392 1726776665.15088: variable 'ansible_search_path' from source: unknown 10392 1726776665.15120: calling self._execute() 10392 1726776665.15197: variable 'ansible_host' from source: host vars for 'managed_node2' 10392 1726776665.15206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10392 1726776665.15213: variable 'omit' from source: magic vars 10392 1726776665.15702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10392 1726776665.17982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10392 1726776665.18057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10392 1726776665.18092: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10392 1726776665.18130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10392 1726776665.18156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10392 1726776665.18225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10392 1726776665.18255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10392 1726776665.18280: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10392 1726776665.18318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10392 1726776665.18334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10392 1726776665.18385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10392 1726776665.18408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10392 1726776665.18433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10392 1726776665.18471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10392 1726776665.18485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10392 1726776665.18652: variable '__kernel_settings_required_facts' from source: role '' all vars 10392 1726776665.18663: variable 'ansible_facts' from source: unknown 10392 1726776665.18746: Evaluated conditional 
(__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10392 1726776665.18752: when evaluation is False, skipping this task 10392 1726776665.18756: _execute() done 10392 1726776665.18759: dumping result to json 10392 1726776665.18762: done dumping result, returning 10392 1726776665.18768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-cec2-986e-000000000413] 10392 1726776665.18774: sending task result for task 120fa90a-8a95-cec2-986e-000000000413 10392 1726776665.18801: done sending task result for task 120fa90a-8a95-cec2-986e-000000000413 10392 1726776665.18804: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}
8218 1726776665.19166: no more pending results, returning what we have 8218 1726776665.19169: results queue empty 8218 1726776665.19170: checking for any_errors_fatal 8218 1726776665.19172: done checking for any_errors_fatal 8218 1726776665.19172: checking for max_fail_percentage 8218 1726776665.19174: done checking for max_fail_percentage 8218 1726776665.19175: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.19175: done checking to see if all hosts have failed 8218 1726776665.19176: getting the remaining hosts for this loop 8218 1726776665.19177: done getting the remaining hosts for this loop 8218 1726776665.19180: getting the next task for host managed_node2 8218 1726776665.19190: done getting next task for host managed_node2 8218 1726776665.19193: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8218 1726776665.19197: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1,
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776665.19211: getting variables 8218 1726776665.19213: in VariableManager get_vars() 8218 1726776665.19249: Calling all_inventory to load vars for managed_node2 8218 1726776665.19253: Calling groups_inventory to load vars for managed_node2 8218 1726776665.19255: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.19264: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.19267: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.19270: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.19443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.19646: done with get_vars() 8218 1726776665.19657: done getting variables
TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10
Thursday 19 September 2024 16:11:05 -0400 (0:00:00.052) 0:00:51.028 ****
8218 1726776665.19751: entering _queue_task() for managed_node2/stat 8218 1726776665.19935: worker is 1 (out of 1 available) 8218 1726776665.19947: exiting _queue_task() for
managed_node2/stat 8218 1726776665.19958: done queuing things up, now waiting for results queue to drain 8218 1726776665.19959: waiting for pending results... 10394 1726776665.20166: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10394 1726776665.20311: in run() - task 120fa90a-8a95-cec2-986e-000000000415 10394 1726776665.20330: variable 'ansible_search_path' from source: unknown 10394 1726776665.20336: variable 'ansible_search_path' from source: unknown 10394 1726776665.20366: calling self._execute() 10394 1726776665.20444: variable 'ansible_host' from source: host vars for 'managed_node2' 10394 1726776665.20453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10394 1726776665.20462: variable 'omit' from source: magic vars 10394 1726776665.20943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10394 1726776665.21167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10394 1726776665.21208: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10394 1726776665.21241: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10394 1726776665.21273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10394 1726776665.21347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10394 1726776665.21372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10394 1726776665.21398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10394 1726776665.21424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10394 1726776665.21527: variable '__kernel_settings_is_ostree' from source: set_fact 10394 1726776665.21541: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10394 1726776665.21546: when evaluation is False, skipping this task 10394 1726776665.21549: _execute() done 10394 1726776665.21553: dumping result to json 10394 1726776665.21556: done dumping result, returning 10394 1726776665.21561: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-cec2-986e-000000000415] 10394 1726776665.21567: sending task result for task 120fa90a-8a95-cec2-986e-000000000415 10394 1726776665.21594: done sending task result for task 120fa90a-8a95-cec2-986e-000000000415 10394 1726776665.21598: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
8218 1726776665.21889: no more pending results, returning what we have 8218 1726776665.21892: results queue empty 8218 1726776665.21892: checking for any_errors_fatal 8218 1726776665.21898: done checking for any_errors_fatal 8218 1726776665.21899: checking for max_fail_percentage 8218 1726776665.21900: done checking for max_fail_percentage 8218 1726776665.21901: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.21902: done checking to see if all hosts have failed 8218 1726776665.21902: getting the remaining hosts for this loop 8218 1726776665.21904: done getting the remaining hosts for this loop 8218 1726776665.21907: getting the next task for host
managed_node2 8218 1726776665.21914: done getting next task for host managed_node2 8218 1726776665.21917: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8218 1726776665.21920: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776665.21938: getting variables 8218 1726776665.21940: in VariableManager get_vars() 8218 1726776665.21972: Calling all_inventory to load vars for managed_node2 8218 1726776665.22030: Calling groups_inventory to load vars for managed_node2 8218 1726776665.22036: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.22044: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.22047: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.22050: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.22204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.22406: done with get_vars() 8218 1726776665.22415: done getting variables 8218 1726776665.22471: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 16:11:05 -0400 (0:00:00.027) 0:00:51.055 **** 8218 1726776665.22503: entering _queue_task() for managed_node2/set_fact 8218 1726776665.22680: worker is 1 (out of 1 available) 8218 1726776665.22692: exiting _queue_task() for managed_node2/set_fact 8218 1726776665.22702: done queuing things up, now waiting for results queue to drain 8218 1726776665.22703: waiting for pending results... 
10395 1726776665.22909: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10395 1726776665.23052: in run() - task 120fa90a-8a95-cec2-986e-000000000416 10395 1726776665.23068: variable 'ansible_search_path' from source: unknown 10395 1726776665.23073: variable 'ansible_search_path' from source: unknown 10395 1726776665.23103: calling self._execute() 10395 1726776665.23180: variable 'ansible_host' from source: host vars for 'managed_node2' 10395 1726776665.23190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10395 1726776665.23198: variable 'omit' from source: magic vars 10395 1726776665.23613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10395 1726776665.23885: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10395 1726776665.23922: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10395 1726776665.23980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10395 1726776665.24011: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10395 1726776665.24092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10395 1726776665.24121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10395 1726776665.24151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10395 1726776665.24177: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10395 1726776665.24291: variable '__kernel_settings_is_ostree' from source: set_fact 10395 1726776665.24304: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10395 1726776665.24309: when evaluation is False, skipping this task 10395 1726776665.24312: _execute() done 10395 1726776665.24316: dumping result to json 10395 1726776665.24319: done dumping result, returning 10395 1726776665.24324: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-cec2-986e-000000000416] 10395 1726776665.24332: sending task result for task 120fa90a-8a95-cec2-986e-000000000416 10395 1726776665.24362: done sending task result for task 120fa90a-8a95-cec2-986e-000000000416 10395 1726776665.24365: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8218 1726776665.24689: no more pending results, returning what we have 8218 1726776665.24692: results queue empty 8218 1726776665.24693: checking for any_errors_fatal 8218 1726776665.24698: done checking for any_errors_fatal 8218 1726776665.24699: checking for max_fail_percentage 8218 1726776665.24700: done checking for max_fail_percentage 8218 1726776665.24701: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.24702: done checking to see if all hosts have failed 8218 1726776665.24703: getting the remaining hosts for this loop 8218 1726776665.24704: done getting the remaining hosts for this loop 8218 1726776665.24707: getting the next task for host managed_node2 8218 1726776665.24716: done getting next task for host managed_node2 8218 1726776665.24719: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8218 1726776665.24722: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776665.24741: getting variables 8218 1726776665.24743: in VariableManager get_vars() 8218 1726776665.24776: Calling all_inventory to load vars for managed_node2 8218 1726776665.24779: Calling groups_inventory to load vars for managed_node2 8218 1726776665.24781: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.24790: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.24793: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.24796: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.24966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.25173: done with get_vars() 8218 1726776665.25183: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 
Thursday 19 September 2024 16:11:05 -0400 (0:00:00.027) 0:00:51.083 **** 8218 1726776665.25278: entering _queue_task() for managed_node2/stat 8218 1726776665.25469: worker is 1 (out of 1 available) 8218 1726776665.25483: exiting _queue_task() for managed_node2/stat 8218 1726776665.25495: done queuing things up, now waiting for results queue to drain 8218 1726776665.25497: waiting for pending results... 10397 1726776665.25721: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10397 1726776665.25877: in run() - task 120fa90a-8a95-cec2-986e-000000000418 10397 1726776665.25895: variable 'ansible_search_path' from source: unknown 10397 1726776665.25900: variable 'ansible_search_path' from source: unknown 10397 1726776665.25932: calling self._execute() 10397 1726776665.26011: variable 'ansible_host' from source: host vars for 'managed_node2' 10397 1726776665.26101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10397 1726776665.26112: variable 'omit' from source: magic vars 10397 1726776665.26582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10397 1726776665.26789: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10397 1726776665.26826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10397 1726776665.26858: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10397 1726776665.26885: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10397 1726776665.26953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10397 1726776665.26975: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10397 1726776665.26998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10397 1726776665.27019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10397 1726776665.27118: variable '__kernel_settings_is_transactional' from source: set_fact 10397 1726776665.27131: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10397 1726776665.27137: when evaluation is False, skipping this task 10397 1726776665.27140: _execute() done 10397 1726776665.27143: dumping result to json 10397 1726776665.27146: done dumping result, returning 10397 1726776665.27150: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-cec2-986e-000000000418] 10397 1726776665.27156: sending task result for task 120fa90a-8a95-cec2-986e-000000000418 10397 1726776665.27180: done sending task result for task 120fa90a-8a95-cec2-986e-000000000418 10397 1726776665.27183: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8218 1726776665.27491: no more pending results, returning what we have 8218 1726776665.27494: results queue empty 8218 1726776665.27495: checking for any_errors_fatal 8218 1726776665.27502: done checking for any_errors_fatal 8218 1726776665.27503: checking for max_fail_percentage 8218 1726776665.27504: done checking for max_fail_percentage 8218 1726776665.27505: checking to 
see if all hosts have failed and the running result is not ok 8218 1726776665.27506: done checking to see if all hosts have failed 8218 1726776665.27506: getting the remaining hosts for this loop 8218 1726776665.27507: done getting the remaining hosts for this loop 8218 1726776665.27511: getting the next task for host managed_node2 8218 1726776665.27517: done getting next task for host managed_node2 8218 1726776665.27521: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8218 1726776665.27524: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776665.27543: getting variables 8218 1726776665.27545: in VariableManager get_vars() 8218 1726776665.27632: Calling all_inventory to load vars for managed_node2 8218 1726776665.27637: Calling groups_inventory to load vars for managed_node2 8218 1726776665.27640: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.27648: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.27651: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.27654: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.27807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.28003: done with get_vars() 8218 1726776665.28012: done getting variables 8218 1726776665.28071: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 16:11:05 -0400 (0:00:00.028) 0:00:51.111 **** 8218 1726776665.28103: entering _queue_task() for managed_node2/set_fact 8218 1726776665.28294: worker is 1 (out of 1 available) 8218 1726776665.28307: exiting _queue_task() for managed_node2/set_fact 8218 1726776665.28319: done queuing things up, now waiting for results queue to drain 8218 1726776665.28320: waiting for pending results... 
10398 1726776665.28575: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10398 1726776665.28725: in run() - task 120fa90a-8a95-cec2-986e-000000000419 10398 1726776665.28747: variable 'ansible_search_path' from source: unknown 10398 1726776665.28752: variable 'ansible_search_path' from source: unknown 10398 1726776665.28785: calling self._execute() 10398 1726776665.28867: variable 'ansible_host' from source: host vars for 'managed_node2' 10398 1726776665.28876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10398 1726776665.28885: variable 'omit' from source: magic vars 10398 1726776665.29328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10398 1726776665.29598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10398 1726776665.29673: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10398 1726776665.29707: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10398 1726776665.29743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10398 1726776665.29803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10398 1726776665.29820: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10398 1726776665.29842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10398 1726776665.29867: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10398 1726776665.29978: variable '__kernel_settings_is_transactional' from source: set_fact 10398 1726776665.29990: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10398 1726776665.29994: when evaluation is False, skipping this task 10398 1726776665.29997: _execute() done 10398 1726776665.30000: dumping result to json 10398 1726776665.30003: done dumping result, returning 10398 1726776665.30009: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-cec2-986e-000000000419] 10398 1726776665.30015: sending task result for task 120fa90a-8a95-cec2-986e-000000000419 10398 1726776665.30046: done sending task result for task 120fa90a-8a95-cec2-986e-000000000419 10398 1726776665.30050: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8218 1726776665.30370: no more pending results, returning what we have 8218 1726776665.30374: results queue empty 8218 1726776665.30374: checking for any_errors_fatal 8218 1726776665.30381: done checking for any_errors_fatal 8218 1726776665.30382: checking for max_fail_percentage 8218 1726776665.30383: done checking for max_fail_percentage 8218 1726776665.30384: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.30385: done checking to see if all hosts have failed 8218 1726776665.30385: getting the remaining hosts for this loop 8218 1726776665.30386: done getting the remaining hosts for this loop 8218 1726776665.30389: getting the next task for host managed_node2 8218 1726776665.30398: done getting next task for host managed_node2 8218 1726776665.30401: ^ 
task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8218 1726776665.30404: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776665.30418: getting variables 8218 1726776665.30420: in VariableManager get_vars() 8218 1726776665.30456: Calling all_inventory to load vars for managed_node2 8218 1726776665.30459: Calling groups_inventory to load vars for managed_node2 8218 1726776665.30461: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.30470: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.30473: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.30476: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.30640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.30852: done with get_vars() 8218 1726776665.30864: done getting variables 8218 1726776665.30921: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 16:11:05 -0400 (0:00:00.028) 0:00:51.140 **** 8218 1726776665.30961: entering _queue_task() for managed_node2/include_vars 8218 1726776665.31163: worker is 1 (out of 1 available) 8218 1726776665.31176: exiting _queue_task() for managed_node2/include_vars 8218 1726776665.31188: done queuing things up, now waiting for results queue to drain 8218 1726776665.31189: waiting for pending results... 10400 1726776665.31411: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10400 1726776665.31572: in run() - task 120fa90a-8a95-cec2-986e-00000000041b 10400 1726776665.31590: variable 'ansible_search_path' from source: unknown 10400 1726776665.31594: variable 'ansible_search_path' from source: unknown 10400 1726776665.31625: calling self._execute() 10400 1726776665.31782: variable 'ansible_host' from source: host vars for 'managed_node2' 10400 1726776665.31793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10400 1726776665.31802: variable 'omit' from source: magic vars 10400 1726776665.31900: variable 'omit' from source: magic vars 10400 1726776665.31968: variable 'omit' from source: magic vars 10400 1726776665.32362: variable 'ffparams' from source: task vars 10400 1726776665.32495: variable 'ansible_facts' from source: unknown 10400 1726776665.32688: variable 'ansible_facts' from source: unknown 10400 1726776665.32817: variable 'ansible_facts' from source: unknown 10400 1726776665.32951: variable 'ansible_facts' from source: unknown 10400 
1726776665.33066: variable 'role_path' from source: magic vars 10400 1726776665.33218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10400 1726776665.33463: Loaded config def from plugin (lookup/first_found) 10400 1726776665.33471: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 10400 1726776665.33503: variable 'ansible_search_path' from source: unknown 10400 1726776665.33523: variable 'ansible_search_path' from source: unknown 10400 1726776665.33536: variable 'ansible_search_path' from source: unknown 10400 1726776665.33544: variable 'ansible_search_path' from source: unknown 10400 1726776665.33550: variable 'ansible_search_path' from source: unknown 10400 1726776665.33567: variable 'omit' from source: magic vars 10400 1726776665.33588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10400 1726776665.33610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10400 1726776665.33630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10400 1726776665.33649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10400 1726776665.33661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10400 1726776665.33686: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10400 1726776665.33692: variable 'ansible_host' from source: host vars for 'managed_node2' 10400 1726776665.33696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10400 1726776665.33787: Set connection var ansible_connection to ssh 10400 1726776665.33797: Set connection var ansible_pipelining to False 10400 1726776665.33803: Set connection var ansible_timeout to 10 
10400 1726776665.33811: Set connection var ansible_module_compression to ZIP_DEFLATED 10400 1726776665.33817: Set connection var ansible_shell_type to sh 10400 1726776665.33822: Set connection var ansible_shell_executable to /bin/sh 10400 1726776665.33846: variable 'ansible_shell_executable' from source: unknown 10400 1726776665.33851: variable 'ansible_connection' from source: unknown 10400 1726776665.33855: variable 'ansible_module_compression' from source: unknown 10400 1726776665.33858: variable 'ansible_shell_type' from source: unknown 10400 1726776665.33860: variable 'ansible_shell_executable' from source: unknown 10400 1726776665.33863: variable 'ansible_host' from source: host vars for 'managed_node2' 10400 1726776665.33867: variable 'ansible_pipelining' from source: unknown 10400 1726776665.33869: variable 'ansible_timeout' from source: unknown 10400 1726776665.33873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10400 1726776665.33971: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10400 1726776665.33983: variable 'omit' from source: magic vars 10400 1726776665.33989: starting attempt loop 10400 1726776665.33992: running the handler 10400 1726776665.34047: handler run complete 10400 1726776665.34058: attempt loop complete, returning result 10400 1726776665.34062: _execute() done 10400 1726776665.34065: dumping result to json 10400 1726776665.34069: done dumping result, returning 10400 1726776665.34075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-cec2-986e-00000000041b] 10400 1726776665.34081: sending task result for task 
120fa90a-8a95-cec2-986e-00000000041b 10400 1726776665.34111: done sending task result for task 120fa90a-8a95-cec2-986e-00000000041b 10400 1726776665.34114: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8218 1726776665.34643: no more pending results, returning what we have 8218 1726776665.34646: results queue empty 8218 1726776665.34647: checking for any_errors_fatal 8218 1726776665.34651: done checking for any_errors_fatal 8218 1726776665.34652: checking for max_fail_percentage 8218 1726776665.34654: done checking for max_fail_percentage 8218 1726776665.34654: checking to see if all hosts have failed and the running result is not ok 8218 1726776665.34655: done checking to see if all hosts have failed 8218 1726776665.34656: getting the remaining hosts for this loop 8218 1726776665.34657: done getting the remaining hosts for this loop 8218 1726776665.34660: getting the next task for host managed_node2 8218 1726776665.34668: done getting next task for host managed_node2 8218 1726776665.34671: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8218 1726776665.34674: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776665.34685: getting variables 8218 1726776665.34686: in VariableManager get_vars() 8218 1726776665.34715: Calling all_inventory to load vars for managed_node2 8218 1726776665.34718: Calling groups_inventory to load vars for managed_node2 8218 1726776665.34720: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776665.34731: Calling all_plugins_play to load vars for managed_node2 8218 1726776665.34737: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776665.34740: Calling groups_plugins_play to load vars for managed_node2 8218 1726776665.34898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776665.35108: done with get_vars() 8218 1726776665.35120: done getting variables 8218 1726776665.35183: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:11:05 -0400 (0:00:00.042) 0:00:51.182 **** 8218 1726776665.35216: entering _queue_task() for managed_node2/package 8218 1726776665.35418: worker is 1 (out of 1 available) 8218 1726776665.35435: exiting _queue_task() for managed_node2/package 8218 1726776665.35447: done queuing things up, now waiting for results queue to drain 8218 1726776665.35449: waiting for pending results... 
10402 1726776665.35722: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10402 1726776665.35865: in run() - task 120fa90a-8a95-cec2-986e-000000000301 10402 1726776665.35884: variable 'ansible_search_path' from source: unknown 10402 1726776665.35888: variable 'ansible_search_path' from source: unknown 10402 1726776665.35921: calling self._execute() 10402 1726776665.36005: variable 'ansible_host' from source: host vars for 'managed_node2' 10402 1726776665.36015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10402 1726776665.36023: variable 'omit' from source: magic vars 10402 1726776665.36125: variable 'omit' from source: magic vars 10402 1726776665.36175: variable 'omit' from source: magic vars 10402 1726776665.36203: variable '__kernel_settings_packages' from source: include_vars 10402 1726776665.36492: variable '__kernel_settings_packages' from source: include_vars 10402 1726776665.36745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10402 1726776665.38894: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10402 1726776665.38967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10402 1726776665.39007: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10402 1726776665.39046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10402 1726776665.39073: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10402 1726776665.39423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 10402 1726776665.39455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10402 1726776665.39479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10402 1726776665.39516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10402 1726776665.39531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10402 1726776665.39638: variable '__kernel_settings_is_ostree' from source: set_fact 10402 1726776665.39646: variable 'omit' from source: magic vars 10402 1726776665.39677: variable 'omit' from source: magic vars 10402 1726776665.39704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10402 1726776665.39732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10402 1726776665.39755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10402 1726776665.39771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10402 1726776665.39783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10402 1726776665.39811: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10402 1726776665.39817: variable 'ansible_host' from source: host vars for 
'managed_node2' 10402 1726776665.39820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10402 1726776665.39962: Set connection var ansible_connection to ssh 10402 1726776665.39972: Set connection var ansible_pipelining to False 10402 1726776665.39979: Set connection var ansible_timeout to 10 10402 1726776665.39987: Set connection var ansible_module_compression to ZIP_DEFLATED 10402 1726776665.39993: Set connection var ansible_shell_type to sh 10402 1726776665.39998: Set connection var ansible_shell_executable to /bin/sh 10402 1726776665.40021: variable 'ansible_shell_executable' from source: unknown 10402 1726776665.40024: variable 'ansible_connection' from source: unknown 10402 1726776665.40026: variable 'ansible_module_compression' from source: unknown 10402 1726776665.40028: variable 'ansible_shell_type' from source: unknown 10402 1726776665.40035: variable 'ansible_shell_executable' from source: unknown 10402 1726776665.40038: variable 'ansible_host' from source: host vars for 'managed_node2' 10402 1726776665.40042: variable 'ansible_pipelining' from source: unknown 10402 1726776665.40045: variable 'ansible_timeout' from source: unknown 10402 1726776665.40048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10402 1726776665.40141: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10402 1726776665.40155: variable 'omit' from source: magic vars 10402 1726776665.40161: starting attempt loop 10402 1726776665.40165: running the handler 10402 1726776665.40263: variable 'ansible_facts' from source: unknown 10402 1726776665.40382: _low_level_execute_command(): starting 10402 1726776665.40391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 10402 1726776665.43036: stdout chunk (state=2): >>>/root <<< 10402 1726776665.43165: stderr chunk (state=3): >>><<< 10402 1726776665.43172: stdout chunk (state=3): >>><<< 10402 1726776665.43192: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10402 1726776665.43208: _low_level_execute_command(): starting 10402 1726776665.43214: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531 `" && echo ansible-tmp-1726776665.43203-10402-35039487682531="` echo /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531 `" ) && sleep 0' 10402 1726776665.46211: stdout chunk (state=2): >>>ansible-tmp-1726776665.43203-10402-35039487682531=/root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531 <<< 10402 1726776665.46356: stderr chunk (state=3): >>><<< 10402 1726776665.46365: stdout chunk (state=3): >>><<< 10402 1726776665.46387: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776665.43203-10402-35039487682531=/root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531 , stderr= 10402 1726776665.46418: variable 'ansible_module_compression' from source: unknown 10402 1726776665.46479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10402 1726776665.46525: variable 'ansible_facts' from source: unknown 10402 1726776665.46647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/AnsiballZ_dnf.py 10402 1726776665.47203: Sending initial data 10402 1726776665.47209: Sent initial data (148 bytes) 10402 1726776665.50710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpkxuh4l3h /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/AnsiballZ_dnf.py <<< 10402 1726776665.52487: stderr chunk (state=3): >>><<< 10402 
1726776665.52497: stdout chunk (state=3): >>><<< 10402 1726776665.52520: done transferring module to remote 10402 1726776665.52535: _low_level_execute_command(): starting 10402 1726776665.52542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/ /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/AnsiballZ_dnf.py && sleep 0' 10402 1726776665.55437: stderr chunk (state=2): >>><<< 10402 1726776665.55446: stdout chunk (state=2): >>><<< 10402 1726776665.55463: _low_level_execute_command() done: rc=0, stdout=, stderr= 10402 1726776665.55468: _low_level_execute_command(): starting 10402 1726776665.55473: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/AnsiballZ_dnf.py && sleep 0' 10402 1726776670.43666: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10402 1726776670.51130: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10402 1726776670.51178: stderr chunk (state=3): >>><<< 10402 1726776670.51186: stdout chunk (state=3): >>><<< 10402 1726776670.51201: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10402 1726776670.51236: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10402 1726776670.51248: _low_level_execute_command(): starting 10402 1726776670.51254: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776665.43203-10402-35039487682531/ > /dev/null 2>&1 && sleep 0' 10402 1726776670.53771: stderr chunk (state=2): >>><<< 10402 1726776670.53781: stdout chunk (state=2): >>><<< 10402 1726776670.53798: _low_level_execute_command() done: rc=0, stdout=, stderr= 10402 1726776670.53806: handler run complete 10402 1726776670.53833: attempt loop complete, returning result 10402 1726776670.53838: _execute() done 10402 1726776670.53841: dumping result to json 10402 1726776670.53849: done dumping result, returning 10402 1726776670.53857: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-cec2-986e-000000000301] 10402 1726776670.53863: sending task result for task 120fa90a-8a95-cec2-986e-000000000301 10402 1726776670.53894: done sending task result for task 120fa90a-8a95-cec2-986e-000000000301 10402 1726776670.53898: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8218 1726776670.54177: no more pending results, 
returning what we have 8218 1726776670.54180: results queue empty 8218 1726776670.54180: checking for any_errors_fatal 8218 1726776670.54186: done checking for any_errors_fatal 8218 1726776670.54187: checking for max_fail_percentage 8218 1726776670.54188: done checking for max_fail_percentage 8218 1726776670.54188: checking to see if all hosts have failed and the running result is not ok 8218 1726776670.54189: done checking to see if all hosts have failed 8218 1726776670.54189: getting the remaining hosts for this loop 8218 1726776670.54190: done getting the remaining hosts for this loop 8218 1726776670.54193: getting the next task for host managed_node2 8218 1726776670.54200: done getting next task for host managed_node2 8218 1726776670.54203: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8218 1726776670.54204: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776670.54212: getting variables 8218 1726776670.54213: in VariableManager get_vars() 8218 1726776670.54246: Calling all_inventory to load vars for managed_node2 8218 1726776670.54248: Calling groups_inventory to load vars for managed_node2 8218 1726776670.54249: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776670.54256: Calling all_plugins_play to load vars for managed_node2 8218 1726776670.54258: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776670.54260: Calling groups_plugins_play to load vars for managed_node2 8218 1726776670.54368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776670.54513: done with get_vars() 8218 1726776670.54520: done getting variables 8218 1726776670.54568: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:11:10 -0400 (0:00:05.193) 0:00:56.376 **** 8218 1726776670.54591: entering _queue_task() for managed_node2/debug 8218 1726776670.54754: worker is 1 (out of 1 available) 8218 1726776670.54767: exiting _queue_task() for managed_node2/debug 8218 1726776670.54778: done queuing things up, now waiting for results queue to drain 8218 1726776670.54780: waiting for pending results... 
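Per the `module_args` in the result above (`name: ["tuned", "python3-configobj"], state: present`), the task at `kernel_settings/tasks/main.yml:12` resembles the following sketch (reconstructed from the log; the role's actual YAML may differ):

```yaml
# Illustrative reconstruction, not the role's verbatim source.
- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"  # resolved to ['tuned', 'python3-configobj'] in this run
    state: present
```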
10488 1726776670.54900: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10488 1726776670.55011: in run() - task 120fa90a-8a95-cec2-986e-000000000303 10488 1726776670.55030: variable 'ansible_search_path' from source: unknown 10488 1726776670.55034: variable 'ansible_search_path' from source: unknown 10488 1726776670.55062: calling self._execute() 10488 1726776670.55127: variable 'ansible_host' from source: host vars for 'managed_node2' 10488 1726776670.55135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10488 1726776670.55142: variable 'omit' from source: magic vars 10488 1726776670.55478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10488 1726776670.56981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10488 1726776670.57214: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10488 1726776670.57246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10488 1726776670.57272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10488 1726776670.57293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10488 1726776670.57351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10488 1726776670.57372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10488 1726776670.57392: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10488 1726776670.57419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10488 1726776670.57432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10488 1726776670.57507: variable '__kernel_settings_is_transactional' from source: set_fact 10488 1726776670.57523: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10488 1726776670.57527: when evaluation is False, skipping this task 10488 1726776670.57532: _execute() done 10488 1726776670.57536: dumping result to json 10488 1726776670.57540: done dumping result, returning 10488 1726776670.57549: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000303] 10488 1726776670.57555: sending task result for task 120fa90a-8a95-cec2-986e-000000000303 10488 1726776670.57577: done sending task result for task 120fa90a-8a95-cec2-986e-000000000303 10488 1726776670.57579: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8218 1726776670.57847: no more pending results, returning what we have 8218 1726776670.57850: results queue empty 8218 1726776670.57851: checking for any_errors_fatal 8218 1726776670.57857: done checking for any_errors_fatal 8218 1726776670.57857: checking for max_fail_percentage 8218 1726776670.57858: done checking for max_fail_percentage 8218 1726776670.57859: checking to see if all hosts have failed and the running 
result is not ok 8218 1726776670.57860: done checking to see if all hosts have failed 8218 1726776670.57860: getting the remaining hosts for this loop 8218 1726776670.57861: done getting the remaining hosts for this loop 8218 1726776670.57863: getting the next task for host managed_node2 8218 1726776670.57868: done getting next task for host managed_node2 8218 1726776670.57870: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8218 1726776670.57872: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776670.57886: getting variables 8218 1726776670.57887: in VariableManager get_vars() 8218 1726776670.57913: Calling all_inventory to load vars for managed_node2 8218 1726776670.57914: Calling groups_inventory to load vars for managed_node2 8218 1726776670.57916: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776670.57923: Calling all_plugins_play to load vars for managed_node2 8218 1726776670.57924: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776670.57926: Calling groups_plugins_play to load vars for managed_node2 8218 1726776670.58035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776670.58151: done with get_vars() 8218 1726776670.58160: done getting variables 8218 1726776670.58200: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:11:10 -0400 (0:00:00.036) 0:00:56.412 **** 8218 1726776670.58224: entering _queue_task() for managed_node2/reboot 8218 1726776670.58380: worker is 1 (out of 1 available) 8218 1726776670.58395: exiting _queue_task() for managed_node2/reboot 8218 1726776670.58406: done queuing things up, now waiting for results queue to drain 8218 1726776670.58408: waiting for pending results... 
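The skip above comes from a `when:` guard: `Evaluated conditional (__kernel_settings_is_transactional | d(false)): False`, where `d()` is the Jinja2 `default` filter, so an undefined variable falls back to `false`. A hedged sketch of such a guarded task (illustrative; the role's actual YAML may differ):

```yaml
# Illustrative only; shows the conditional pattern from the log.
- name: Reboot transactional update systems
  reboot:
  when: __kernel_settings_is_transactional | d(false)  # d() = default(); undefined var => false => task skipped
```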
10489 1726776670.58531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10489 1726776670.58637: in run() - task 120fa90a-8a95-cec2-986e-000000000304 10489 1726776670.58656: variable 'ansible_search_path' from source: unknown 10489 1726776670.58660: variable 'ansible_search_path' from source: unknown 10489 1726776670.58687: calling self._execute() 10489 1726776670.58750: variable 'ansible_host' from source: host vars for 'managed_node2' 10489 1726776670.58759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10489 1726776670.58768: variable 'omit' from source: magic vars 10489 1726776670.59095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10489 1726776670.60899: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10489 1726776670.60963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10489 1726776670.60999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10489 1726776670.61034: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10489 1726776670.61064: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10489 1726776670.61138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10489 1726776670.61171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10489 1726776670.61195: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10489 1726776670.61236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10489 1726776670.61254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10489 1726776670.61351: variable '__kernel_settings_is_transactional' from source: set_fact 10489 1726776670.61369: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10489 1726776670.61373: when evaluation is False, skipping this task 10489 1726776670.61376: _execute() done 10489 1726776670.61379: dumping result to json 10489 1726776670.61382: done dumping result, returning 10489 1726776670.61388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-cec2-986e-000000000304] 10489 1726776670.61394: sending task result for task 120fa90a-8a95-cec2-986e-000000000304 10489 1726776670.61422: done sending task result for task 120fa90a-8a95-cec2-986e-000000000304 10489 1726776670.61425: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776670.61768: no more pending results, returning what we have 8218 1726776670.61770: results queue empty 8218 1726776670.61771: checking for any_errors_fatal 8218 1726776670.61775: done checking for any_errors_fatal 8218 1726776670.61775: checking for max_fail_percentage 8218 1726776670.61776: done checking for max_fail_percentage 8218 1726776670.61777: checking 
to see if all hosts have failed and the running result is not ok 8218 1726776670.61778: done checking to see if all hosts have failed 8218 1726776670.61778: getting the remaining hosts for this loop 8218 1726776670.61779: done getting the remaining hosts for this loop 8218 1726776670.61781: getting the next task for host managed_node2 8218 1726776670.61788: done getting next task for host managed_node2 8218 1726776670.61791: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8218 1726776670.61793: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776670.61805: getting variables 8218 1726776670.61806: in VariableManager get_vars() 8218 1726776670.61835: Calling all_inventory to load vars for managed_node2 8218 1726776670.61837: Calling groups_inventory to load vars for managed_node2 8218 1726776670.61838: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776670.61846: Calling all_plugins_play to load vars for managed_node2 8218 1726776670.61848: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776670.61849: Calling groups_plugins_play to load vars for managed_node2 8218 1726776670.61956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776670.62110: done with get_vars() 8218 1726776670.62118: done getting variables 8218 1726776670.62161: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:11:10 -0400 (0:00:00.039) 0:00:56.452 **** 8218 1726776670.62183: entering _queue_task() for managed_node2/fail 8218 1726776670.62341: worker is 1 (out of 1 available) 8218 1726776670.62353: exiting _queue_task() for managed_node2/fail 8218 1726776670.62365: done queuing things up, now waiting for results queue to drain 8218 1726776670.62367: waiting for pending results... 
10491 1726776670.62491: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10491 1726776670.62604: in run() - task 120fa90a-8a95-cec2-986e-000000000305 10491 1726776670.62620: variable 'ansible_search_path' from source: unknown 10491 1726776670.62624: variable 'ansible_search_path' from source: unknown 10491 1726776670.62654: calling self._execute() 10491 1726776670.62716: variable 'ansible_host' from source: host vars for 'managed_node2' 10491 1726776670.62725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10491 1726776670.62734: variable 'omit' from source: magic vars 10491 1726776670.63066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10491 1726776670.65388: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10491 1726776670.65460: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10491 1726776670.65500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10491 1726776670.65567: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10491 1726776670.65593: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10491 1726776670.65671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10491 1726776670.65698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10491 1726776670.65724: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10491 1726776670.65769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10491 1726776670.65784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10491 1726776670.65888: variable '__kernel_settings_is_transactional' from source: set_fact 10491 1726776670.65907: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10491 1726776670.65912: when evaluation is False, skipping this task 10491 1726776670.65916: _execute() done 10491 1726776670.65919: dumping result to json 10491 1726776670.65923: done dumping result, returning 10491 1726776670.65931: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-cec2-986e-000000000305] 10491 1726776670.65938: sending task result for task 120fa90a-8a95-cec2-986e-000000000305 10491 1726776670.65971: done sending task result for task 120fa90a-8a95-cec2-986e-000000000305 10491 1726776670.65975: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776670.66386: no more pending results, returning what we have 8218 1726776670.66389: results queue empty 8218 1726776670.66390: checking for any_errors_fatal 8218 1726776670.66398: done checking for any_errors_fatal 8218 1726776670.66399: checking for max_fail_percentage 8218 1726776670.66401: done checking for max_fail_percentage 8218 1726776670.66401: 
checking to see if all hosts have failed and the running result is not ok 8218 1726776670.66402: done checking to see if all hosts have failed 8218 1726776670.66403: getting the remaining hosts for this loop 8218 1726776670.66404: done getting the remaining hosts for this loop 8218 1726776670.66408: getting the next task for host managed_node2 8218 1726776670.66416: done getting next task for host managed_node2 8218 1726776670.66420: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8218 1726776670.66422: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776670.66440: getting variables 8218 1726776670.66445: in VariableManager get_vars() 8218 1726776670.66482: Calling all_inventory to load vars for managed_node2 8218 1726776670.66486: Calling groups_inventory to load vars for managed_node2 8218 1726776670.66488: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776670.66497: Calling all_plugins_play to load vars for managed_node2 8218 1726776670.66500: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776670.66503: Calling groups_plugins_play to load vars for managed_node2 8218 1726776670.66681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776670.66888: done with get_vars() 8218 1726776670.66899: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:11:10 -0400 (0:00:00.048) 0:00:56.500 **** 8218 1726776670.66989: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776670.67199: worker is 1 (out of 1 available) 8218 1726776670.67212: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776670.67224: done queuing things up, now waiting for results queue to drain 8218 1726776670.67226: waiting for pending results... 
10493 1726776670.67450: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10493 1726776670.67574: in run() - task 120fa90a-8a95-cec2-986e-000000000307 10493 1726776670.67590: variable 'ansible_search_path' from source: unknown 10493 1726776670.67597: variable 'ansible_search_path' from source: unknown 10493 1726776670.67626: calling self._execute() 10493 1726776670.67698: variable 'ansible_host' from source: host vars for 'managed_node2' 10493 1726776670.67707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10493 1726776670.67714: variable 'omit' from source: magic vars 10493 1726776670.67804: variable 'omit' from source: magic vars 10493 1726776670.67848: variable 'omit' from source: magic vars 10493 1726776670.67872: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10493 1726776670.68125: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10493 1726776670.68201: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10493 1726776670.68235: variable 'omit' from source: magic vars 10493 1726776670.68276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10493 1726776670.68312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10493 1726776670.68333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10493 1726776670.68416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10493 1726776670.68432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10493 1726776670.68463: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10493 1726776670.68470: variable 'ansible_host' from 
source: host vars for 'managed_node2' 10493 1726776670.68474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10493 1726776670.68571: Set connection var ansible_connection to ssh 10493 1726776670.68580: Set connection var ansible_pipelining to False 10493 1726776670.68586: Set connection var ansible_timeout to 10 10493 1726776670.68594: Set connection var ansible_module_compression to ZIP_DEFLATED 10493 1726776670.68601: Set connection var ansible_shell_type to sh 10493 1726776670.68607: Set connection var ansible_shell_executable to /bin/sh 10493 1726776670.68626: variable 'ansible_shell_executable' from source: unknown 10493 1726776670.68633: variable 'ansible_connection' from source: unknown 10493 1726776670.68637: variable 'ansible_module_compression' from source: unknown 10493 1726776670.68640: variable 'ansible_shell_type' from source: unknown 10493 1726776670.68646: variable 'ansible_shell_executable' from source: unknown 10493 1726776670.68649: variable 'ansible_host' from source: host vars for 'managed_node2' 10493 1726776670.68653: variable 'ansible_pipelining' from source: unknown 10493 1726776670.68656: variable 'ansible_timeout' from source: unknown 10493 1726776670.68660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10493 1726776670.68824: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10493 1726776670.68837: variable 'omit' from source: magic vars 10493 1726776670.68846: starting attempt loop 10493 1726776670.68850: running the handler 10493 1726776670.68862: _low_level_execute_command(): starting 10493 1726776670.68871: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10493 1726776670.71801: stdout chunk (state=2): >>>/root <<< 10493 1726776670.71974: 
stderr chunk (state=3): >>><<< 10493 1726776670.71983: stdout chunk (state=3): >>><<< 10493 1726776670.72006: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10493 1726776670.72022: _low_level_execute_command(): starting 10493 1726776670.72031: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473 `" && echo ansible-tmp-1726776670.7201579-10493-204511564201473="` echo /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473 `" ) && sleep 0' 10493 1726776670.75246: stdout chunk (state=2): >>>ansible-tmp-1726776670.7201579-10493-204511564201473=/root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473 <<< 10493 1726776670.75403: stderr chunk (state=3): >>><<< 10493 1726776670.75413: stdout chunk (state=3): >>><<< 10493 1726776670.75435: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776670.7201579-10493-204511564201473=/root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473 , stderr= 10493 1726776670.75483: variable 'ansible_module_compression' from source: unknown 10493 1726776670.75525: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10493 1726776670.75566: variable 'ansible_facts' from source: unknown 10493 1726776670.75662: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/AnsiballZ_kernel_settings_get_config.py 10493 1726776670.75972: Sending initial data 10493 1726776670.75984: Sent initial data (174 bytes) 10493 1726776670.78475: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmppgyb_x09 /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/AnsiballZ_kernel_settings_get_config.py <<< 10493 
1726776670.79709: stderr chunk (state=3): >>><<< 10493 1726776670.79716: stdout chunk (state=3): >>><<< 10493 1726776670.79736: done transferring module to remote 10493 1726776670.79748: _low_level_execute_command(): starting 10493 1726776670.79754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/ /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10493 1726776670.82221: stderr chunk (state=2): >>><<< 10493 1726776670.82230: stdout chunk (state=2): >>><<< 10493 1726776670.82249: _low_level_execute_command() done: rc=0, stdout=, stderr= 10493 1726776670.82255: _low_level_execute_command(): starting 10493 1726776670.82261: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10493 1726776670.98897: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10493 1726776671.00027: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10493 1726776671.00079: stderr chunk (state=3): >>><<< 10493 1726776671.00087: stdout chunk (state=3): >>><<< 10493 1726776671.00102: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 10493 1726776671.00132: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10493 1726776671.00143: _low_level_execute_command(): starting 10493 1726776671.00149: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776670.7201579-10493-204511564201473/ > /dev/null 2>&1 && sleep 0' 10493 1726776671.02626: stderr chunk (state=2): >>><<< 10493 1726776671.02636: stdout chunk (state=2): >>><<< 10493 1726776671.02651: _low_level_execute_command() done: rc=0, stdout=, stderr= 10493 1726776671.02658: handler run complete 10493 1726776671.02675: attempt loop complete, returning result 10493 1726776671.02680: _execute() done 10493 1726776671.02683: dumping 
result to json 10493 1726776671.02688: done dumping result, returning 10493 1726776671.02695: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-cec2-986e-000000000307] 10493 1726776671.02702: sending task result for task 120fa90a-8a95-cec2-986e-000000000307 10493 1726776671.02739: done sending task result for task 120fa90a-8a95-cec2-986e-000000000307 10493 1726776671.02742: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8218 1726776671.03083: no more pending results, returning what we have 8218 1726776671.03085: results queue empty 8218 1726776671.03086: checking for any_errors_fatal 8218 1726776671.03090: done checking for any_errors_fatal 8218 1726776671.03090: checking for max_fail_percentage 8218 1726776671.03092: done checking for max_fail_percentage 8218 1726776671.03092: checking to see if all hosts have failed and the running result is not ok 8218 1726776671.03093: done checking to see if all hosts have failed 8218 1726776671.03093: getting the remaining hosts for this loop 8218 1726776671.03094: done getting the remaining hosts for this loop 8218 1726776671.03096: getting the next task for host managed_node2 8218 1726776671.03100: done getting next task for host managed_node2 8218 1726776671.03103: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8218 1726776671.03104: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776671.03111: getting variables 8218 1726776671.03112: in VariableManager get_vars() 8218 1726776671.03150: Calling all_inventory to load vars for managed_node2 8218 1726776671.03152: Calling groups_inventory to load vars for managed_node2 8218 1726776671.03154: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776671.03161: Calling all_plugins_play to load vars for managed_node2 8218 1726776671.03163: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776671.03165: Calling groups_plugins_play to load vars for managed_node2 8218 1726776671.03308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776671.03424: done with get_vars() 8218 1726776671.03434: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:11:11 -0400 (0:00:00.365) 0:00:56.865 **** 8218 1726776671.03505: entering _queue_task() for managed_node2/stat 8218 1726776671.03665: worker is 1 (out of 1 available) 8218 1726776671.03679: exiting _queue_task() for managed_node2/stat 8218 1726776671.03690: done queuing things up, now waiting for results queue to drain 8218 1726776671.03692: waiting for pending results... 
10513 1726776671.03808: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10513 1726776671.03918: in run() - task 120fa90a-8a95-cec2-986e-000000000308 10513 1726776671.03934: variable 'ansible_search_path' from source: unknown 10513 1726776671.03938: variable 'ansible_search_path' from source: unknown 10513 1726776671.03974: variable '__prof_from_conf' from source: task vars 10513 1726776671.04200: variable '__prof_from_conf' from source: task vars 10513 1726776671.04330: variable '__data' from source: task vars 10513 1726776671.04385: variable '__kernel_settings_register_tuned_main' from source: set_fact 10513 1726776671.04535: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10513 1726776671.04545: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10513 1726776671.04589: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10513 1726776671.04608: variable 'omit' from source: magic vars 10513 1726776671.04687: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.04698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.04706: variable 'omit' from source: magic vars 10513 1726776671.04887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10513 1726776671.07060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10513 1726776671.07118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10513 1726776671.07163: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10513 1726776671.07198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10513 1726776671.07222: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10513 1726776671.07294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10513 1726776671.07320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10513 1726776671.07342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10513 1726776671.07369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10513 1726776671.07378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10513 1726776671.07445: variable 'item' from source: unknown 10513 1726776671.07456: Evaluated conditional (item | length > 0): False 10513 1726776671.07460: when evaluation is False, skipping this task 10513 1726776671.07478: variable 'item' from source: unknown 10513 1726776671.07519: variable 'item' from source: unknown skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 10513 1726776671.07574: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.07581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.07586: variable 'omit' from source: magic vars 10513 
1726776671.07697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10513 1726776671.07720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10513 1726776671.07746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10513 1726776671.07785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10513 1726776671.07801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10513 1726776671.07856: variable 'item' from source: unknown 10513 1726776671.07864: Evaluated conditional (item | length > 0): True 10513 1726776671.07871: variable 'omit' from source: magic vars 10513 1726776671.07896: variable 'omit' from source: magic vars 10513 1726776671.07925: variable 'item' from source: unknown 10513 1726776671.07969: variable 'item' from source: unknown 10513 1726776671.07983: variable 'omit' from source: magic vars 10513 1726776671.08001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10513 1726776671.08022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10513 1726776671.08038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10513 1726776671.08052: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10513 1726776671.08062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10513 1726776671.08083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10513 1726776671.08087: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.08091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.08158: Set connection var ansible_connection to ssh 10513 1726776671.08166: Set connection var ansible_pipelining to False 10513 1726776671.08172: Set connection var ansible_timeout to 10 10513 1726776671.08179: Set connection var ansible_module_compression to ZIP_DEFLATED 10513 1726776671.08184: Set connection var ansible_shell_type to sh 10513 1726776671.08190: Set connection var ansible_shell_executable to /bin/sh 10513 1726776671.08203: variable 'ansible_shell_executable' from source: unknown 10513 1726776671.08207: variable 'ansible_connection' from source: unknown 10513 1726776671.08210: variable 'ansible_module_compression' from source: unknown 10513 1726776671.08213: variable 'ansible_shell_type' from source: unknown 10513 1726776671.08216: variable 'ansible_shell_executable' from source: unknown 10513 1726776671.08220: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.08224: variable 'ansible_pipelining' from source: unknown 10513 1726776671.08227: variable 'ansible_timeout' from source: unknown 10513 1726776671.08233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.08327: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10513 1726776671.08345: variable 'omit' from source: magic vars 10513 1726776671.08350: starting attempt loop 10513 1726776671.08354: running the handler 10513 1726776671.08364: _low_level_execute_command(): starting 10513 1726776671.08369: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10513 1726776671.10639: stdout chunk (state=2): >>>/root <<< 10513 1726776671.10754: stderr chunk (state=3): >>><<< 10513 1726776671.10760: stdout chunk (state=3): >>><<< 10513 1726776671.10775: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10513 1726776671.10786: _low_level_execute_command(): starting 10513 1726776671.10791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911 `" && echo ansible-tmp-1726776671.1078267-10513-63336010678911="` echo /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911 `" ) && sleep 0' 10513 1726776671.13361: stdout chunk (state=2): >>>ansible-tmp-1726776671.1078267-10513-63336010678911=/root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911 <<< 10513 1726776671.13482: stderr chunk (state=3): >>><<< 10513 1726776671.13488: stdout chunk (state=3): >>><<< 10513 1726776671.13501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776671.1078267-10513-63336010678911=/root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911 , stderr= 10513 1726776671.13536: variable 'ansible_module_compression' from source: unknown 10513 1726776671.13575: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10513 1726776671.13599: variable 'ansible_facts' from source: unknown 10513 1726776671.13663: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/AnsiballZ_stat.py 10513 1726776671.13755: Sending initial data 10513 1726776671.13762: Sent initial data (151 bytes) 10513 1726776671.16278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmppfm1205k /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/AnsiballZ_stat.py <<< 10513 1726776671.17368: stderr chunk (state=3): >>><<< 10513 1726776671.17376: stdout chunk (state=3): >>><<< 10513 1726776671.17397: done transferring module to remote 10513 1726776671.17410: _low_level_execute_command(): starting 10513 1726776671.17416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/ /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/AnsiballZ_stat.py && sleep 0' 10513 1726776671.19804: stderr chunk (state=2): >>><<< 10513 1726776671.19812: stdout chunk (state=2): >>><<< 10513 1726776671.19825: _low_level_execute_command() done: rc=0, stdout=, stderr= 10513 1726776671.19830: _low_level_execute_command(): starting 10513 1726776671.19835: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/AnsiballZ_stat.py && sleep 0' 10513 1726776671.35002: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10513 1726776671.36210: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10513 1726776671.36265: stderr chunk (state=3): >>><<< 10513 1726776671.36272: stdout chunk (state=3): >>><<< 10513 1726776671.36286: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 10513 1726776671.36317: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10513 1726776671.36334: _low_level_execute_command(): starting 10513 1726776671.36344: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776671.1078267-10513-63336010678911/ > /dev/null 2>&1 && sleep 0' 10513 1726776671.38857: stderr chunk (state=2): >>><<< 10513 1726776671.38865: stdout chunk (state=2): >>><<< 10513 1726776671.38878: _low_level_execute_command() done: rc=0, stdout=, stderr= 10513 1726776671.38884: handler run complete 10513 1726776671.38900: attempt loop complete, returning result 10513 1726776671.38916: variable 'item' from source: unknown 10513 1726776671.38978: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10513 
1726776671.39069: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.39079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.39088: variable 'omit' from source: magic vars 10513 1726776671.39193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10513 1726776671.39215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10513 1726776671.39236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10513 1726776671.39265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10513 1726776671.39277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10513 1726776671.39346: variable 'item' from source: unknown 10513 1726776671.39356: Evaluated conditional (item | length > 0): True 10513 1726776671.39361: variable 'omit' from source: magic vars 10513 1726776671.39377: variable 'omit' from source: magic vars 10513 1726776671.39416: variable 'item' from source: unknown 10513 1726776671.39480: variable 'item' from source: unknown 10513 1726776671.39496: variable 'omit' from source: magic vars 10513 1726776671.39514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10513 
1726776671.39522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10513 1726776671.39528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10513 1726776671.39545: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10513 1726776671.39549: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.39553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.39627: Set connection var ansible_connection to ssh 10513 1726776671.39636: Set connection var ansible_pipelining to False 10513 1726776671.39644: Set connection var ansible_timeout to 10 10513 1726776671.39652: Set connection var ansible_module_compression to ZIP_DEFLATED 10513 1726776671.39657: Set connection var ansible_shell_type to sh 10513 1726776671.39662: Set connection var ansible_shell_executable to /bin/sh 10513 1726776671.39677: variable 'ansible_shell_executable' from source: unknown 10513 1726776671.39681: variable 'ansible_connection' from source: unknown 10513 1726776671.39684: variable 'ansible_module_compression' from source: unknown 10513 1726776671.39686: variable 'ansible_shell_type' from source: unknown 10513 1726776671.39692: variable 'ansible_shell_executable' from source: unknown 10513 1726776671.39698: variable 'ansible_host' from source: host vars for 'managed_node2' 10513 1726776671.39702: variable 'ansible_pipelining' from source: unknown 10513 1726776671.39705: variable 'ansible_timeout' from source: unknown 10513 1726776671.39711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10513 1726776671.39811: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10513 1726776671.39822: variable 'omit' from source: magic vars 10513 1726776671.39827: starting attempt loop 10513 1726776671.39832: running the handler 10513 1726776671.39838: _low_level_execute_command(): starting 10513 1726776671.39842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10513 1726776671.42252: stdout chunk (state=2): >>>/root <<< 10513 1726776671.42445: stderr chunk (state=3): >>><<< 10513 1726776671.42455: stdout chunk (state=3): >>><<< 10513 1726776671.42469: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10513 1726776671.42477: _low_level_execute_command(): starting 10513 1726776671.42481: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784 `" && echo ansible-tmp-1726776671.4247372-10513-90751899914784="` echo /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784 `" ) && sleep 0' 10513 1726776671.45151: stdout chunk (state=2): >>>ansible-tmp-1726776671.4247372-10513-90751899914784=/root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784 <<< 10513 1726776671.45268: stderr chunk (state=3): >>><<< 10513 1726776671.45275: stdout chunk (state=3): >>><<< 10513 1726776671.45290: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776671.4247372-10513-90751899914784=/root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784 , stderr= 10513 1726776671.45320: variable 'ansible_module_compression' from source: unknown 10513 1726776671.45360: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10513 1726776671.45379: variable 'ansible_facts' from source: unknown 10513 
1726776671.45434: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/AnsiballZ_stat.py 10513 1726776671.45523: Sending initial data 10513 1726776671.45532: Sent initial data (151 bytes) 10513 1726776671.48102: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp9a_6xhv5 /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/AnsiballZ_stat.py <<< 10513 1726776671.49190: stderr chunk (state=3): >>><<< 10513 1726776671.49198: stdout chunk (state=3): >>><<< 10513 1726776671.49216: done transferring module to remote 10513 1726776671.49224: _low_level_execute_command(): starting 10513 1726776671.49230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/ /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/AnsiballZ_stat.py && sleep 0' 10513 1726776671.51665: stderr chunk (state=2): >>><<< 10513 1726776671.51673: stdout chunk (state=2): >>><<< 10513 1726776671.51688: _low_level_execute_command() done: rc=0, stdout=, stderr= 10513 1726776671.51693: _low_level_execute_command(): starting 10513 1726776671.51698: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/AnsiballZ_stat.py && sleep 0' 10513 1726776671.68644: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 
4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10513 1726776671.69896: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10513 1726776671.69939: stderr chunk (state=3): >>><<< 10513 1726776671.69949: stdout chunk (state=3): >>><<< 10513 1726776671.69965: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
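The round-trip above (make a remote temp dir, push `AnsiballZ_stat.py` over sftp, run it, read the JSON result on stdout) returns essentially the fields that Python's `os.stat` exposes. A minimal sketch of how such a result dict could be assembled — an illustration of the idea only, not the actual source of the `ansible.builtin.stat` module:

```python
import os
import stat

def stat_to_dict(path):
    """Collect a few of the fields seen in the module output above."""
    st = os.stat(path)  # follows symlinks, like the module's default follow behavior for stat data
    mode = st.st_mode
    return {
        "exists": True,
        "path": path,
        "mode": "%04o" % stat.S_IMODE(mode),   # e.g. "0755"
        "isdir": stat.S_ISDIR(mode),
        "isreg": stat.S_ISREG(mode),
        "islnk": stat.S_ISLNK(mode),           # always False here since os.stat resolved the link
        "uid": st.st_uid,
        "gid": st.st_gid,
        "size": st.st_size,
        "inode": st.st_ino,
        "nlink": st.st_nlink,
    }
```

The real module adds checksum, mimetype, and attribute lookups on top of this core, as the `module_args` in the invocation block show.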
10513 1726776671.69999: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10513 1726776671.70008: _low_level_execute_command(): starting 10513 1726776671.70013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776671.4247372-10513-90751899914784/ > /dev/null 2>&1 && sleep 0' 10513 1726776671.72441: stderr chunk (state=2): >>><<< 10513 1726776671.72449: stdout chunk (state=2): >>><<< 10513 1726776671.72463: _low_level_execute_command() done: rc=0, stdout=, stderr= 10513 1726776671.72469: handler run complete 10513 1726776671.72498: attempt loop complete, returning result 10513 1726776671.72514: variable 'item' from source: unknown 10513 1726776671.72575: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776634.1489303, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726776632.1399238, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726776632.1399238, "nlink": 4, "path": "/etc/tuned", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10513 1726776671.72617: dumping result to json 10513 1726776671.72627: done dumping result, returning 10513 1726776671.72638: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-cec2-986e-000000000308] 10513 1726776671.72644: sending task result for task 120fa90a-8a95-cec2-986e-000000000308 10513 1726776671.72684: done sending task result for task 120fa90a-8a95-cec2-986e-000000000308 10513 1726776671.72688: WORKER PROCESS EXITING 8218 1726776671.72916: no more pending results, returning what we have 8218 1726776671.72919: results queue empty 8218 1726776671.72920: checking for any_errors_fatal 8218 1726776671.72926: done checking for any_errors_fatal 8218 1726776671.72926: checking for max_fail_percentage 8218 1726776671.72928: done checking for max_fail_percentage 8218 1726776671.72930: checking to see if all hosts have failed and the running result is not ok 8218 1726776671.72931: done checking to see if all hosts have failed 8218 1726776671.72932: getting the remaining hosts for this loop 8218 1726776671.72933: done getting the remaining hosts for this loop 8218 1726776671.72936: getting the next task for host managed_node2 8218 1726776671.72940: done getting next task for host managed_node2 8218 1726776671.72942: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8218 1726776671.72946: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776671.72954: getting variables 8218 1726776671.72955: in VariableManager get_vars() 8218 1726776671.72981: Calling all_inventory to load vars for managed_node2 8218 1726776671.72983: Calling groups_inventory to load vars for managed_node2 8218 1726776671.72984: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776671.72991: Calling all_plugins_play to load vars for managed_node2 8218 1726776671.72993: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776671.72994: Calling groups_plugins_play to load vars for managed_node2 8218 1726776671.73103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776671.73222: done with get_vars() 8218 1726776671.73233: done getting variables 8218 1726776671.73283: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:11:11 -0400 (0:00:00.698) 0:00:57.563 **** 8218 1726776671.73313: entering _queue_task() for managed_node2/set_fact 8218 1726776671.73522: worker is 1 (out of 1 available) 8218 1726776671.73537: exiting _queue_task() for managed_node2/set_fact 
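The `HOST STATE: block=2, task=26, ...` dumps in the strategy output are flat `key=value` pairs (plus nested `child state?` sections). A small sketch of pulling the flat fields out of such a line for log analysis — the helper and approach are my own, not anything Ansible provides:

```python
import re

def parse_host_state(line):
    """Extract flat key=value fields from a HOST STATE debug line."""
    fields = {}
    # Matches tokens like block=2, run_state=1, update_handlers=True
    for key, value in re.findall(r"(\w+)=(\w+)", line):
        if value.isdigit():
            fields[key] = int(value)
        elif value in ("True", "False"):
            fields[key] = value == "True"
        else:
            fields[key] = value
    return fields
```

Note this deliberately ignores the parenthesized child-state sections, which repeat the same grammar recursively.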
8218 1726776671.73552: done queuing things up, now waiting for results queue to drain 8218 1726776671.73553: waiting for pending results... 10544 1726776671.73781: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10544 1726776671.73918: in run() - task 120fa90a-8a95-cec2-986e-000000000309 10544 1726776671.73938: variable 'ansible_search_path' from source: unknown 10544 1726776671.73942: variable 'ansible_search_path' from source: unknown 10544 1726776671.73975: calling self._execute() 10544 1726776671.74055: variable 'ansible_host' from source: host vars for 'managed_node2' 10544 1726776671.74065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10544 1726776671.74073: variable 'omit' from source: magic vars 10544 1726776671.74474: variable 'omit' from source: magic vars 10544 1726776671.74508: variable 'omit' from source: magic vars 10544 1726776671.74814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10544 1726776671.76274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10544 1726776671.76325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10544 1726776671.76354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10544 1726776671.76377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10544 1726776671.76396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10544 1726776671.76453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10544 1726776671.76473: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10544 1726776671.76495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10544 1726776671.76522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10544 1726776671.76533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10544 1726776671.76565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10544 1726776671.76580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10544 1726776671.76593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10544 1726776671.76619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10544 1726776671.76627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
10544 1726776671.76672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10544 1726776671.76689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10544 1726776671.76707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10544 1726776671.76735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10544 1726776671.76748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10544 1726776671.76890: variable '__kernel_settings_find_profile_dirs' from source: set_fact 10544 1726776671.76955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10544 1726776671.77061: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10544 1726776671.77088: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10544 1726776671.77113: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10544 1726776671.77136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10544 1726776671.77168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10544 1726776671.77185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10544 1726776671.77202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10544 1726776671.77219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10544 1726776671.77260: variable 'omit' from source: magic vars 10544 1726776671.77280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10544 1726776671.77299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10544 1726776671.77314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10544 1726776671.77327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10544 1726776671.77338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10544 1726776671.77362: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10544 1726776671.77367: variable 'ansible_host' from source: host vars for 'managed_node2' 10544 1726776671.77371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10544 1726776671.77434: Set connection var ansible_connection to ssh 10544 1726776671.77442: Set connection var ansible_pipelining to False 10544 1726776671.77451: Set connection var ansible_timeout to 10 
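`Set connection var ansible_module_compression to ZIP_DEFLATED` above refers to how AnsiballZ bundles module code into a deflate-compressed zip before shipping it over the connection. A rough stdlib sketch of that packaging idea — not Ansible's actual wrapper code:

```python
import io
import zipfile

def pack_module(name, source):
    """Bundle module source into an in-memory ZIP_DEFLATED archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name + ".py", source)
    return buf.getvalue()

def unpack_module(blob, name):
    """Read the module source back out, as the remote-side wrapper would."""
    with zipfile.ZipFile(io.BytesIO(blob)) as zf:
        return zf.read(name + ".py").decode()
```

This is also why the log can reuse a cached artifact like `ansible.modules.stat-ZIP_DEFLATED`: the compressed bundle is deterministic per module and compression setting, so it only needs to be built once per run.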
10544 1726776671.77458: Set connection var ansible_module_compression to ZIP_DEFLATED 10544 1726776671.77463: Set connection var ansible_shell_type to sh 10544 1726776671.77468: Set connection var ansible_shell_executable to /bin/sh 10544 1726776671.77485: variable 'ansible_shell_executable' from source: unknown 10544 1726776671.77489: variable 'ansible_connection' from source: unknown 10544 1726776671.77493: variable 'ansible_module_compression' from source: unknown 10544 1726776671.77496: variable 'ansible_shell_type' from source: unknown 10544 1726776671.77499: variable 'ansible_shell_executable' from source: unknown 10544 1726776671.77502: variable 'ansible_host' from source: host vars for 'managed_node2' 10544 1726776671.77507: variable 'ansible_pipelining' from source: unknown 10544 1726776671.77510: variable 'ansible_timeout' from source: unknown 10544 1726776671.77514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10544 1726776671.77576: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10544 1726776671.77587: variable 'omit' from source: magic vars 10544 1726776671.77593: starting attempt loop 10544 1726776671.77597: running the handler 10544 1726776671.77606: handler run complete 10544 1726776671.77614: attempt loop complete, returning result 10544 1726776671.77617: _execute() done 10544 1726776671.77620: dumping result to json 10544 1726776671.77624: done dumping result, returning 10544 1726776671.77631: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-cec2-986e-000000000309] 10544 1726776671.77637: sending task result for task 120fa90a-8a95-cec2-986e-000000000309 10544 
1726776671.77658: done sending task result for task 120fa90a-8a95-cec2-986e-000000000309 10544 1726776671.77662: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8218 1726776671.77778: no more pending results, returning what we have 8218 1726776671.77781: results queue empty 8218 1726776671.77782: checking for any_errors_fatal 8218 1726776671.77789: done checking for any_errors_fatal 8218 1726776671.77790: checking for max_fail_percentage 8218 1726776671.77791: done checking for max_fail_percentage 8218 1726776671.77792: checking to see if all hosts have failed and the running result is not ok 8218 1726776671.77793: done checking to see if all hosts have failed 8218 1726776671.77793: getting the remaining hosts for this loop 8218 1726776671.77794: done getting the remaining hosts for this loop 8218 1726776671.77798: getting the next task for host managed_node2 8218 1726776671.77804: done getting next task for host managed_node2 8218 1726776671.77807: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8218 1726776671.77809: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776671.77819: getting variables 8218 1726776671.77820: in VariableManager get_vars() 8218 1726776671.77854: Calling all_inventory to load vars for managed_node2 8218 1726776671.77857: Calling groups_inventory to load vars for managed_node2 8218 1726776671.77859: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776671.77868: Calling all_plugins_play to load vars for managed_node2 8218 1726776671.77870: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776671.77872: Calling groups_plugins_play to load vars for managed_node2 8218 1726776671.78163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776671.78272: done with get_vars() 8218 1726776671.78279: done getting variables 8218 1726776671.78317: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:11:11 -0400 (0:00:00.050) 0:00:57.613 **** 8218 1726776671.78339: entering _queue_task() for managed_node2/service 8218 1726776671.78489: worker is 1 (out of 1 available) 8218 1726776671.78504: exiting _queue_task() for managed_node2/service 8218 1726776671.78516: done queuing things up, now waiting for results queue to drain 8218 1726776671.78517: waiting for pending results... 
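The service task queued here will come back with a large systemd status dict (`Type=dbus`, `MainPID=659`, and so on, as seen in the `AnsiballZ_systemd.py` stdout further down). Those properties share the shape of `systemctl show` output; a small sketch of turning that `Key=Value` text into a dict — my own helper for reading such output, not what the `ansible.builtin.systemd` module does internally:

```python
def parse_unit_properties(text):
    """Parse `systemctl show <unit>` style Key=Value lines into a dict."""
    props = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")  # split on the first '=' only
            props[key] = value
    return props
```

Splitting only on the first `=` matters because values like `ExecStart={ path=/usr/sbin/tuned ; ... }` contain embedded `=` signs.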
10546 1726776671.78637: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10546 1726776671.78746: in run() - task 120fa90a-8a95-cec2-986e-00000000030a 10546 1726776671.78759: variable 'ansible_search_path' from source: unknown 10546 1726776671.78763: variable 'ansible_search_path' from source: unknown 10546 1726776671.78793: variable '__kernel_settings_services' from source: include_vars 10546 1726776671.79022: variable '__kernel_settings_services' from source: include_vars 10546 1726776671.79082: variable 'omit' from source: magic vars 10546 1726776671.79164: variable 'ansible_host' from source: host vars for 'managed_node2' 10546 1726776671.79175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10546 1726776671.79185: variable 'omit' from source: magic vars 10546 1726776671.79235: variable 'omit' from source: magic vars 10546 1726776671.79267: variable 'omit' from source: magic vars 10546 1726776671.79299: variable 'item' from source: unknown 10546 1726776671.79357: variable 'item' from source: unknown 10546 1726776671.79376: variable 'omit' from source: magic vars 10546 1726776671.79407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10546 1726776671.79433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10546 1726776671.79452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10546 1726776671.79465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10546 1726776671.79475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10546 1726776671.79497: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10546 
1726776671.79502: variable 'ansible_host' from source: host vars for 'managed_node2' 10546 1726776671.79507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10546 1726776671.79574: Set connection var ansible_connection to ssh 10546 1726776671.79581: Set connection var ansible_pipelining to False 10546 1726776671.79588: Set connection var ansible_timeout to 10 10546 1726776671.79595: Set connection var ansible_module_compression to ZIP_DEFLATED 10546 1726776671.79600: Set connection var ansible_shell_type to sh 10546 1726776671.79606: Set connection var ansible_shell_executable to /bin/sh 10546 1726776671.79618: variable 'ansible_shell_executable' from source: unknown 10546 1726776671.79621: variable 'ansible_connection' from source: unknown 10546 1726776671.79624: variable 'ansible_module_compression' from source: unknown 10546 1726776671.79625: variable 'ansible_shell_type' from source: unknown 10546 1726776671.79627: variable 'ansible_shell_executable' from source: unknown 10546 1726776671.79640: variable 'ansible_host' from source: host vars for 'managed_node2' 10546 1726776671.79646: variable 'ansible_pipelining' from source: unknown 10546 1726776671.79650: variable 'ansible_timeout' from source: unknown 10546 1726776671.79654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10546 1726776671.79745: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10546 1726776671.79757: variable 'omit' from source: magic vars 10546 1726776671.79762: starting attempt loop 10546 1726776671.79766: running the handler 10546 1726776671.79822: variable 'ansible_facts' from source: unknown 10546 1726776671.79901: _low_level_execute_command(): starting 10546 1726776671.79911: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10546 1726776671.82241: stdout chunk (state=2): >>>/root <<< 10546 1726776671.82358: stderr chunk (state=3): >>><<< 10546 1726776671.82365: stdout chunk (state=3): >>><<< 10546 1726776671.82381: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10546 1726776671.82393: _low_level_execute_command(): starting 10546 1726776671.82399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545 `" && echo ansible-tmp-1726776671.8238766-10546-95016960914545="` echo /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545 `" ) && sleep 0' 10546 1726776671.85017: stdout chunk (state=2): >>>ansible-tmp-1726776671.8238766-10546-95016960914545=/root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545 <<< 10546 1726776671.85147: stderr chunk (state=3): >>><<< 10546 1726776671.85153: stdout chunk (state=3): >>><<< 10546 1726776671.85166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776671.8238766-10546-95016960914545=/root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545 , stderr= 10546 1726776671.85191: variable 'ansible_module_compression' from source: unknown 10546 1726776671.85237: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10546 1726776671.85285: variable 'ansible_facts' from source: unknown 10546 1726776671.85445: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/AnsiballZ_systemd.py 10546 1726776671.85570: Sending initial data 10546 1726776671.85578: Sent initial data (154 bytes) 10546 1726776671.88074: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp47w6nfac 
/root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/AnsiballZ_systemd.py <<< 10546 1726776671.90491: stderr chunk (state=3): >>><<< 10546 1726776671.90500: stdout chunk (state=3): >>><<< 10546 1726776671.90521: done transferring module to remote 10546 1726776671.90533: _low_level_execute_command(): starting 10546 1726776671.90539: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/ /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/AnsiballZ_systemd.py && sleep 0' 10546 1726776671.92911: stderr chunk (state=2): >>><<< 10546 1726776671.92918: stdout chunk (state=2): >>><<< 10546 1726776671.92933: _low_level_execute_command() done: rc=0, stdout=, stderr= 10546 1726776671.92938: _low_level_execute_command(): starting 10546 1726776671.92945: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/AnsiballZ_systemd.py && sleep 0' 10546 1726776672.21407: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18415616", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 10546 1726776672.21467: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", 
"LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": 
"infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10546 1726776672.23129: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10546 1726776672.23171: stderr chunk (state=3): >>><<< 10546 1726776672.23178: stdout chunk (state=3): >>><<< 10546 1726776672.23196: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18415616", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10546 1726776672.23304: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10546 1726776672.23323: _low_level_execute_command(): starting 10546 1726776672.23330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776671.8238766-10546-95016960914545/ > /dev/null 2>&1 && sleep 0' 10546 1726776672.25726: stderr chunk (state=2): >>><<< 10546 1726776672.25734: stdout chunk (state=2): >>><<< 10546 1726776672.25747: _low_level_execute_command() done: rc=0, stdout=, stderr= 10546 1726776672.25754: handler run complete 10546 1726776672.25789: attempt loop 
complete, returning result 10546 1726776672.25806: variable 'item' from source: unknown 10546 1726776672.25864: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon 
cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "659", "MemoryAccounting": "yes", "MemoryCurrent": "18415616", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": 
"sysinit.target dbus.socket system.slice dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "WatchdogUSec": "0" } } 10546 1726776672.25957: dumping result to json 10546 1726776672.25975: done dumping result, returning 10546 1726776672.25983: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-cec2-986e-00000000030a] 10546 1726776672.25990: sending task result for task 
120fa90a-8a95-cec2-986e-00000000030a 10546 1726776672.26093: done sending task result for task 120fa90a-8a95-cec2-986e-00000000030a 10546 1726776672.26097: WORKER PROCESS EXITING 8218 1726776672.26441: no more pending results, returning what we have 8218 1726776672.26444: results queue empty 8218 1726776672.26445: checking for any_errors_fatal 8218 1726776672.26448: done checking for any_errors_fatal 8218 1726776672.26448: checking for max_fail_percentage 8218 1726776672.26449: done checking for max_fail_percentage 8218 1726776672.26450: checking to see if all hosts have failed and the running result is not ok 8218 1726776672.26450: done checking to see if all hosts have failed 8218 1726776672.26451: getting the remaining hosts for this loop 8218 1726776672.26451: done getting the remaining hosts for this loop 8218 1726776672.26454: getting the next task for host managed_node2 8218 1726776672.26458: done getting next task for host managed_node2 8218 1726776672.26460: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8218 1726776672.26462: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776672.26469: getting variables 8218 1726776672.26470: in VariableManager get_vars() 8218 1726776672.26491: Calling all_inventory to load vars for managed_node2 8218 1726776672.26493: Calling groups_inventory to load vars for managed_node2 8218 1726776672.26494: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776672.26501: Calling all_plugins_play to load vars for managed_node2 8218 1726776672.26503: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776672.26504: Calling groups_plugins_play to load vars for managed_node2 8218 1726776672.26619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776672.26771: done with get_vars() 8218 1726776672.26779: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:11:12 -0400 (0:00:00.485) 0:00:58.099 **** 8218 1726776672.26851: entering _queue_task() for managed_node2/file 8218 1726776672.27034: worker is 1 (out of 1 available) 8218 1726776672.27047: exiting _queue_task() for managed_node2/file 8218 1726776672.27057: done queuing things up, now waiting for results queue to drain 8218 1726776672.27059: waiting for pending results... 
10566 1726776672.27271: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10566 1726776672.27404: in run() - task 120fa90a-8a95-cec2-986e-00000000030b 10566 1726776672.27421: variable 'ansible_search_path' from source: unknown 10566 1726776672.27425: variable 'ansible_search_path' from source: unknown 10566 1726776672.27458: calling self._execute() 10566 1726776672.27534: variable 'ansible_host' from source: host vars for 'managed_node2' 10566 1726776672.27544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10566 1726776672.27552: variable 'omit' from source: magic vars 10566 1726776672.27643: variable 'omit' from source: magic vars 10566 1726776672.27690: variable 'omit' from source: magic vars 10566 1726776672.27716: variable '__kernel_settings_profile_dir' from source: role '' all vars 10566 1726776672.27995: variable '__kernel_settings_profile_dir' from source: role '' all vars 10566 1726776672.28090: variable '__kernel_settings_profile_parent' from source: set_fact 10566 1726776672.28099: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10566 1726776672.28148: variable 'omit' from source: magic vars 10566 1726776672.28186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10566 1726776672.28219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10566 1726776672.28243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10566 1726776672.28260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10566 1726776672.28272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10566 1726776672.28299: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 10566 1726776672.28306: variable 'ansible_host' from source: host vars for 'managed_node2' 10566 1726776672.28310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10566 1726776672.28402: Set connection var ansible_connection to ssh 10566 1726776672.28411: Set connection var ansible_pipelining to False 10566 1726776672.28417: Set connection var ansible_timeout to 10 10566 1726776672.28424: Set connection var ansible_module_compression to ZIP_DEFLATED 10566 1726776672.28431: Set connection var ansible_shell_type to sh 10566 1726776672.28437: Set connection var ansible_shell_executable to /bin/sh 10566 1726776672.28456: variable 'ansible_shell_executable' from source: unknown 10566 1726776672.28461: variable 'ansible_connection' from source: unknown 10566 1726776672.28464: variable 'ansible_module_compression' from source: unknown 10566 1726776672.28467: variable 'ansible_shell_type' from source: unknown 10566 1726776672.28469: variable 'ansible_shell_executable' from source: unknown 10566 1726776672.28472: variable 'ansible_host' from source: host vars for 'managed_node2' 10566 1726776672.28476: variable 'ansible_pipelining' from source: unknown 10566 1726776672.28478: variable 'ansible_timeout' from source: unknown 10566 1726776672.28482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10566 1726776672.28661: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10566 1726776672.28673: variable 'omit' from source: magic vars 10566 1726776672.28679: starting attempt loop 10566 1726776672.28682: running the handler 10566 1726776672.28693: _low_level_execute_command(): starting 10566 1726776672.28702: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 10566 1726776672.31236: stdout chunk (state=2): >>>/root <<< 10566 1726776672.31687: stderr chunk (state=3): >>><<< 10566 1726776672.31695: stdout chunk (state=3): >>><<< 10566 1726776672.31717: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10566 1726776672.31733: _low_level_execute_command(): starting 10566 1726776672.31739: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777 `" && echo ansible-tmp-1726776672.3172612-10566-166652314510777="` echo /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777 `" ) && sleep 0' 10566 1726776672.35700: stdout chunk (state=2): >>>ansible-tmp-1726776672.3172612-10566-166652314510777=/root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777 <<< 10566 1726776672.35827: stderr chunk (state=3): >>><<< 10566 1726776672.35835: stdout chunk (state=3): >>><<< 10566 1726776672.35853: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776672.3172612-10566-166652314510777=/root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777 , stderr= 10566 1726776672.35888: variable 'ansible_module_compression' from source: unknown 10566 1726776672.35931: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10566 1726776672.35964: variable 'ansible_facts' from source: unknown 10566 1726776672.36032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/AnsiballZ_file.py 10566 1726776672.36128: Sending initial data 10566 1726776672.36138: Sent initial data (152 bytes) 10566 1726776672.39134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp7thj54sb /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/AnsiballZ_file.py <<< 10566 
1726776672.40626: stderr chunk (state=3): >>><<< 10566 1726776672.40635: stdout chunk (state=3): >>><<< 10566 1726776672.40658: done transferring module to remote 10566 1726776672.40669: _low_level_execute_command(): starting 10566 1726776672.40675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/ /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/AnsiballZ_file.py && sleep 0' 10566 1726776672.43025: stderr chunk (state=2): >>><<< 10566 1726776672.43034: stdout chunk (state=2): >>><<< 10566 1726776672.43050: _low_level_execute_command() done: rc=0, stdout=, stderr= 10566 1726776672.43054: _low_level_execute_command(): starting 10566 1726776672.43059: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/AnsiballZ_file.py && sleep 0' 10566 1726776672.59315: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10566 1726776672.60441: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10566 1726776672.60493: stderr chunk (state=3): >>><<< 10566 1726776672.60500: stdout chunk (state=3): >>><<< 10566 1726776672.60522: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10566 1726776672.60558: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10566 1726776672.60569: _low_level_execute_command(): starting 10566 1726776672.60574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776672.3172612-10566-166652314510777/ > /dev/null 2>&1 && sleep 0' 10566 1726776672.63180: stderr chunk (state=2): >>><<< 10566 1726776672.63188: stdout chunk (state=2): >>><<< 10566 1726776672.63204: _low_level_execute_command() done: rc=0, stdout=, stderr= 10566 1726776672.63213: handler run complete 10566 1726776672.63233: attempt loop complete, returning result 10566 1726776672.63237: _execute() done 10566 1726776672.63240: dumping result to json 10566 1726776672.63247: done dumping result, returning 10566 1726776672.63255: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-cec2-986e-00000000030b] 10566 1726776672.63262: sending task result for task 120fa90a-8a95-cec2-986e-00000000030b 10566 1726776672.63293: done sending task result for task 120fa90a-8a95-cec2-986e-00000000030b 10566 1726776672.63297: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": 
"/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8218 1726776672.63454: no more pending results, returning what we have 8218 1726776672.63457: results queue empty 8218 1726776672.63458: checking for any_errors_fatal 8218 1726776672.63474: done checking for any_errors_fatal 8218 1726776672.63474: checking for max_fail_percentage 8218 1726776672.63476: done checking for max_fail_percentage 8218 1726776672.63477: checking to see if all hosts have failed and the running result is not ok 8218 1726776672.63477: done checking to see if all hosts have failed 8218 1726776672.63478: getting the remaining hosts for this loop 8218 1726776672.63479: done getting the remaining hosts for this loop 8218 1726776672.63482: getting the next task for host managed_node2 8218 1726776672.63489: done getting next task for host managed_node2 8218 1726776672.63492: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8218 1726776672.63495: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776672.63505: getting variables 8218 1726776672.63506: in VariableManager get_vars() 8218 1726776672.63542: Calling all_inventory to load vars for managed_node2 8218 1726776672.63545: Calling groups_inventory to load vars for managed_node2 8218 1726776672.63547: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776672.63554: Calling all_plugins_play to load vars for managed_node2 8218 1726776672.63556: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776672.63558: Calling groups_plugins_play to load vars for managed_node2 8218 1726776672.63667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776672.63790: done with get_vars() 8218 1726776672.63799: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:11:12 -0400 (0:00:00.370) 0:00:58.469 **** 8218 1726776672.63872: entering _queue_task() for managed_node2/slurp 8218 1726776672.64033: worker is 1 (out of 1 available) 8218 1726776672.64048: exiting _queue_task() for managed_node2/slurp 8218 1726776672.64060: done queuing things up, now waiting for results queue to drain 8218 1726776672.64062: waiting for pending results... 
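The `_low_level_execute_command()` calls above create a uniquely named remote temp directory before each module transfer. As a minimal sketch (not Ansible's actual implementation), the naming pattern visible in the log appears to combine the epoch timestamp, the worker PID, and a random number:

```python
import os
import random
import time


def make_remote_tmp_name(base="~/.ansible/tmp"):
    # Sketch of the pattern seen in the log:
    #   ansible-tmp-<epoch time>-<pid>-<random number>
    # e.g. ansible-tmp-1726776672.3172612-10566-166652314510777
    return "%s/ansible-tmp-%s-%s-%s" % (
        base, time.time(), os.getpid(), random.randint(0, 2**48))


name = make_remote_tmp_name()
```

On the remote side the directory is then created with `umask 77 && mkdir -p`, so it is only readable by the connecting user.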
10581 1726776672.64194: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10581 1726776672.64303: in run() - task 120fa90a-8a95-cec2-986e-00000000030c 10581 1726776672.64319: variable 'ansible_search_path' from source: unknown 10581 1726776672.64323: variable 'ansible_search_path' from source: unknown 10581 1726776672.64355: calling self._execute() 10581 1726776672.64423: variable 'ansible_host' from source: host vars for 'managed_node2' 10581 1726776672.64433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10581 1726776672.64441: variable 'omit' from source: magic vars 10581 1726776672.64515: variable 'omit' from source: magic vars 10581 1726776672.64563: variable 'omit' from source: magic vars 10581 1726776672.64590: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10581 1726776672.64807: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10581 1726776672.64873: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10581 1726776672.64902: variable 'omit' from source: magic vars 10581 1726776672.64936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10581 1726776672.64963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10581 1726776672.64981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10581 1726776672.64995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10581 1726776672.65006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10581 1726776672.65127: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10581 1726776672.65136: variable 'ansible_host' from 
source: host vars for 'managed_node2' 10581 1726776672.65141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10581 1726776672.65241: Set connection var ansible_connection to ssh 10581 1726776672.65249: Set connection var ansible_pipelining to False 10581 1726776672.65256: Set connection var ansible_timeout to 10 10581 1726776672.65265: Set connection var ansible_module_compression to ZIP_DEFLATED 10581 1726776672.65270: Set connection var ansible_shell_type to sh 10581 1726776672.65275: Set connection var ansible_shell_executable to /bin/sh 10581 1726776672.65294: variable 'ansible_shell_executable' from source: unknown 10581 1726776672.65298: variable 'ansible_connection' from source: unknown 10581 1726776672.65301: variable 'ansible_module_compression' from source: unknown 10581 1726776672.65304: variable 'ansible_shell_type' from source: unknown 10581 1726776672.65306: variable 'ansible_shell_executable' from source: unknown 10581 1726776672.65309: variable 'ansible_host' from source: host vars for 'managed_node2' 10581 1726776672.65312: variable 'ansible_pipelining' from source: unknown 10581 1726776672.65315: variable 'ansible_timeout' from source: unknown 10581 1726776672.65318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10581 1726776672.65494: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10581 1726776672.65505: variable 'omit' from source: magic vars 10581 1726776672.65511: starting attempt loop 10581 1726776672.65514: running the handler 10581 1726776672.65527: _low_level_execute_command(): starting 10581 1726776672.65536: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10581 1726776672.67897: stdout chunk (state=2): >>>/root <<< 10581 1726776672.68015: 
stderr chunk (state=3): >>><<< 10581 1726776672.68021: stdout chunk (state=3): >>><<< 10581 1726776672.68041: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10581 1726776672.68057: _low_level_execute_command(): starting 10581 1726776672.68063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330 `" && echo ansible-tmp-1726776672.6805222-10581-52718098947330="` echo /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330 `" ) && sleep 0' 10581 1726776672.70705: stdout chunk (state=2): >>>ansible-tmp-1726776672.6805222-10581-52718098947330=/root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330 <<< 10581 1726776672.70837: stderr chunk (state=3): >>><<< 10581 1726776672.70844: stdout chunk (state=3): >>><<< 10581 1726776672.70861: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776672.6805222-10581-52718098947330=/root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330 , stderr= 10581 1726776672.70899: variable 'ansible_module_compression' from source: unknown 10581 1726776672.70934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10581 1726776672.70966: variable 'ansible_facts' from source: unknown 10581 1726776672.71038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/AnsiballZ_slurp.py 10581 1726776672.71214: Sending initial data 10581 1726776672.71223: Sent initial data (152 bytes) 10581 1726776672.73681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp9dp_a1od /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/AnsiballZ_slurp.py <<< 10581 1726776672.75058: stderr chunk (state=3): >>><<< 10581 1726776672.75071: stdout chunk (state=3): >>><<< 10581 
1726776672.75096: done transferring module to remote 10581 1726776672.75109: _low_level_execute_command(): starting 10581 1726776672.75115: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/ /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/AnsiballZ_slurp.py && sleep 0' 10581 1726776672.77527: stderr chunk (state=2): >>><<< 10581 1726776672.77539: stdout chunk (state=2): >>><<< 10581 1726776672.77556: _low_level_execute_command() done: rc=0, stdout=, stderr= 10581 1726776672.77560: _low_level_execute_command(): starting 10581 1726776672.77565: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/AnsiballZ_slurp.py && sleep 0' 10581 1726776672.92672: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10581 1726776672.93802: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10581 1726776672.93814: stdout chunk (state=3): >>><<< 10581 1726776672.93825: stderr chunk (state=3): >>><<< 10581 1726776672.93842: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.12.75 closed. 
10581 1726776672.93870: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10581 1726776672.93882: _low_level_execute_command(): starting 10581 1726776672.93888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776672.6805222-10581-52718098947330/ > /dev/null 2>&1 && sleep 0' 10581 1726776672.96599: stderr chunk (state=2): >>><<< 10581 1726776672.96611: stdout chunk (state=2): >>><<< 10581 1726776672.96631: _low_level_execute_command() done: rc=0, stdout=, stderr= 10581 1726776672.96640: handler run complete 10581 1726776672.96656: attempt loop complete, returning result 10581 1726776672.96661: _execute() done 10581 1726776672.96663: dumping result to json 10581 1726776672.96667: done dumping result, returning 10581 1726776672.96674: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-cec2-986e-00000000030c] 10581 1726776672.96680: sending task result for task 120fa90a-8a95-cec2-986e-00000000030c 10581 1726776672.96716: done sending task result for task 120fa90a-8a95-cec2-986e-00000000030c 10581 1726776672.96720: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8218 1726776672.97116: no more pending 
results, returning what we have 8218 1726776672.97120: results queue empty 8218 1726776672.97121: checking for any_errors_fatal 8218 1726776672.97133: done checking for any_errors_fatal 8218 1726776672.97134: checking for max_fail_percentage 8218 1726776672.97136: done checking for max_fail_percentage 8218 1726776672.97137: checking to see if all hosts have failed and the running result is not ok 8218 1726776672.97138: done checking to see if all hosts have failed 8218 1726776672.97138: getting the remaining hosts for this loop 8218 1726776672.97140: done getting the remaining hosts for this loop 8218 1726776672.97143: getting the next task for host managed_node2 8218 1726776672.97150: done getting next task for host managed_node2 8218 1726776672.97154: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8218 1726776672.97156: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776672.97168: getting variables 8218 1726776672.97169: in VariableManager get_vars() 8218 1726776672.97207: Calling all_inventory to load vars for managed_node2 8218 1726776672.97210: Calling groups_inventory to load vars for managed_node2 8218 1726776672.97213: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776672.97223: Calling all_plugins_play to load vars for managed_node2 8218 1726776672.97226: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776672.97230: Calling groups_plugins_play to load vars for managed_node2 8218 1726776672.97410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776672.97618: done with get_vars() 8218 1726776672.97633: done getting variables 8218 1726776672.97694: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:11:12 -0400 (0:00:00.338) 0:00:58.807 **** 8218 1726776672.97731: entering _queue_task() for managed_node2/set_fact 8218 1726776672.97943: worker is 1 (out of 1 available) 8218 1726776672.97957: exiting _queue_task() for managed_node2/set_fact 8218 1726776672.97971: done queuing things up, now waiting for results queue to drain 8218 1726776672.97972: waiting for pending results... 
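The slurp module above returns the file content base64-encoded. Decoding the payload captured in the log recovers the contents of `/etc/tuned/active_profile`, which matches the `__kernel_settings_active_profile` fact set by the next task:

```python
import base64

# Payload from the slurp result in the log above.
content_b64 = "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK"

# slurp always reports encoding=base64; decode to get the raw file text.
active_profile = base64.b64decode(content_b64).decode("utf-8")
print(active_profile)  # -> "virtual-guest kernel_settings\n"
```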
10600 1726776672.98200: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10600 1726776672.98342: in run() - task 120fa90a-8a95-cec2-986e-00000000030d 10600 1726776672.98361: variable 'ansible_search_path' from source: unknown 10600 1726776672.98366: variable 'ansible_search_path' from source: unknown 10600 1726776672.98398: calling self._execute() 10600 1726776672.98599: variable 'ansible_host' from source: host vars for 'managed_node2' 10600 1726776672.98611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10600 1726776672.98621: variable 'omit' from source: magic vars 10600 1726776672.98722: variable 'omit' from source: magic vars 10600 1726776672.98770: variable 'omit' from source: magic vars 10600 1726776672.99216: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10600 1726776672.99227: variable '__cur_profile' from source: task vars 10600 1726776672.99377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10600 1726776673.01583: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10600 1726776673.01661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10600 1726776673.01699: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10600 1726776673.01736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10600 1726776673.01762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10600 1726776673.01831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10600 
1726776673.01856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10600 1726776673.01877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10600 1726776673.01909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10600 1726776673.01920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10600 1726776673.02020: variable '__kernel_settings_tuned_current_profile' from source: set_fact 10600 1726776673.02074: variable 'omit' from source: magic vars 10600 1726776673.02103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10600 1726776673.02127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10600 1726776673.02147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10600 1726776673.02161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10600 1726776673.02171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10600 1726776673.02199: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10600 1726776673.02206: variable 'ansible_host' from source: host vars for 'managed_node2' 10600 1726776673.02210: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 10600 1726776673.02305: Set connection var ansible_connection to ssh 10600 1726776673.02313: Set connection var ansible_pipelining to False 10600 1726776673.02320: Set connection var ansible_timeout to 10 10600 1726776673.02328: Set connection var ansible_module_compression to ZIP_DEFLATED 10600 1726776673.02383: Set connection var ansible_shell_type to sh 10600 1726776673.02389: Set connection var ansible_shell_executable to /bin/sh 10600 1726776673.02410: variable 'ansible_shell_executable' from source: unknown 10600 1726776673.02414: variable 'ansible_connection' from source: unknown 10600 1726776673.02417: variable 'ansible_module_compression' from source: unknown 10600 1726776673.02420: variable 'ansible_shell_type' from source: unknown 10600 1726776673.02422: variable 'ansible_shell_executable' from source: unknown 10600 1726776673.02425: variable 'ansible_host' from source: host vars for 'managed_node2' 10600 1726776673.02431: variable 'ansible_pipelining' from source: unknown 10600 1726776673.02434: variable 'ansible_timeout' from source: unknown 10600 1726776673.02437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10600 1726776673.02523: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10600 1726776673.02538: variable 'omit' from source: magic vars 10600 1726776673.02544: starting attempt loop 10600 1726776673.02547: running the handler 10600 1726776673.02557: handler run complete 10600 1726776673.02566: attempt loop complete, returning result 10600 1726776673.02569: _execute() done 10600 1726776673.02571: dumping result to json 10600 1726776673.02574: done dumping result, returning 10600 1726776673.02581: done running TaskExecutor() 
for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-cec2-986e-00000000030d] 10600 1726776673.02586: sending task result for task 120fa90a-8a95-cec2-986e-00000000030d 10600 1726776673.02611: done sending task result for task 120fa90a-8a95-cec2-986e-00000000030d 10600 1726776673.02614: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8218 1726776673.03305: no more pending results, returning what we have 8218 1726776673.03309: results queue empty 8218 1726776673.03310: checking for any_errors_fatal 8218 1726776673.03314: done checking for any_errors_fatal 8218 1726776673.03315: checking for max_fail_percentage 8218 1726776673.03317: done checking for max_fail_percentage 8218 1726776673.03318: checking to see if all hosts have failed and the running result is not ok 8218 1726776673.03319: done checking to see if all hosts have failed 8218 1726776673.03319: getting the remaining hosts for this loop 8218 1726776673.03321: done getting the remaining hosts for this loop 8218 1726776673.03324: getting the next task for host managed_node2 8218 1726776673.03332: done getting next task for host managed_node2 8218 1726776673.03336: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8218 1726776673.03338: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776673.03358: getting variables 8218 1726776673.03360: in VariableManager get_vars() 8218 1726776673.03387: Calling all_inventory to load vars for managed_node2 8218 1726776673.03390: Calling groups_inventory to load vars for managed_node2 8218 1726776673.03392: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776673.03403: Calling all_plugins_play to load vars for managed_node2 8218 1726776673.03407: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776673.03410: Calling groups_plugins_play to load vars for managed_node2 8218 1726776673.03569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776673.03764: done with get_vars() 8218 1726776673.03774: done getting variables 8218 1726776673.03823: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 16:11:13 -0400 (0:00:00.061) 0:00:58.869 **** 8218 1726776673.03857: entering _queue_task() for managed_node2/copy 8218 1726776673.04053: worker is 1 (out of 1 available) 8218 1726776673.04065: exiting _queue_task() for managed_node2/copy 8218 1726776673.04075: done queuing things up, now waiting for results queue to drain 8218 1726776673.04077: waiting for pending results... 
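The "Ensure kernel_settings is in active_profile" task queued above only needs to rewrite the file if the profile name is missing. A minimal sketch (not the role's actual Jinja2 logic) of the implied check, treating the active profile as a space-separated list of profile names:

```python
# Value of __kernel_settings_active_profile from the set_fact result above.
active_profile = "virtual-guest kernel_settings"

# Is "kernel_settings" already one of the space-separated profile names?
needs_update = "kernel_settings" not in active_profile.split()
print(needs_update)  # -> False: the profile is already present, no change needed
```

This is consistent with the copy task reporting `changed: false` on reruns once `kernel_settings` has been appended.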
10602 1726776673.04337: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10602 1726776673.04466: in run() - task 120fa90a-8a95-cec2-986e-00000000030e 10602 1726776673.04482: variable 'ansible_search_path' from source: unknown 10602 1726776673.04486: variable 'ansible_search_path' from source: unknown 10602 1726776673.04515: calling self._execute() 10602 1726776673.04592: variable 'ansible_host' from source: host vars for 'managed_node2' 10602 1726776673.04603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10602 1726776673.04613: variable 'omit' from source: magic vars 10602 1726776673.04715: variable 'omit' from source: magic vars 10602 1726776673.04764: variable 'omit' from source: magic vars 10602 1726776673.04791: variable '__kernel_settings_active_profile' from source: set_fact 10602 1726776673.05078: variable '__kernel_settings_active_profile' from source: set_fact 10602 1726776673.05102: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10602 1726776673.05177: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10602 1726776673.05297: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10602 1726776673.05325: variable 'omit' from source: magic vars 10602 1726776673.05369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10602 1726776673.05404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10602 1726776673.05425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10602 1726776673.05444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10602 1726776673.05458: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10602 1726776673.05485: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10602 1726776673.05490: variable 'ansible_host' from source: host vars for 'managed_node2' 10602 1726776673.05494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10602 1726776673.05589: Set connection var ansible_connection to ssh 10602 1726776673.05598: Set connection var ansible_pipelining to False 10602 1726776673.05605: Set connection var ansible_timeout to 10 10602 1726776673.05613: Set connection var ansible_module_compression to ZIP_DEFLATED 10602 1726776673.05619: Set connection var ansible_shell_type to sh 10602 1726776673.05624: Set connection var ansible_shell_executable to /bin/sh 10602 1726776673.05647: variable 'ansible_shell_executable' from source: unknown 10602 1726776673.05652: variable 'ansible_connection' from source: unknown 10602 1726776673.05656: variable 'ansible_module_compression' from source: unknown 10602 1726776673.05659: variable 'ansible_shell_type' from source: unknown 10602 1726776673.05662: variable 'ansible_shell_executable' from source: unknown 10602 1726776673.05664: variable 'ansible_host' from source: host vars for 'managed_node2' 10602 1726776673.05668: variable 'ansible_pipelining' from source: unknown 10602 1726776673.05671: variable 'ansible_timeout' from source: unknown 10602 1726776673.05674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10602 1726776673.05792: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10602 1726776673.05805: variable 'omit' from source: magic vars 10602 1726776673.05811: starting attempt 
loop 10602 1726776673.05814: running the handler 10602 1726776673.05825: _low_level_execute_command(): starting 10602 1726776673.05834: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10602 1726776673.09038: stdout chunk (state=2): >>>/root <<< 10602 1726776673.09410: stderr chunk (state=3): >>><<< 10602 1726776673.09418: stdout chunk (state=3): >>><<< 10602 1726776673.09441: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10602 1726776673.09457: _low_level_execute_command(): starting 10602 1726776673.09464: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011 `" && echo ansible-tmp-1726776673.0945036-10602-210055409405011="` echo /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011 `" ) && sleep 0' 10602 1726776673.12486: stdout chunk (state=2): >>>ansible-tmp-1726776673.0945036-10602-210055409405011=/root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011 <<< 10602 1726776673.12614: stderr chunk (state=3): >>><<< 10602 1726776673.12620: stdout chunk (state=3): >>><<< 10602 1726776673.12635: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776673.0945036-10602-210055409405011=/root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011 , stderr= 10602 1726776673.12701: variable 'ansible_module_compression' from source: unknown 10602 1726776673.12748: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10602 1726776673.12777: variable 'ansible_facts' from source: unknown 10602 1726776673.12845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_stat.py 10602 1726776673.12924: Sending initial data 10602 1726776673.12933: Sent initial data (152 bytes) 10602 1726776673.15460: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmphd8cmhr4 /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_stat.py <<< 10602 1726776673.16537: stderr chunk (state=3): >>><<< 10602 1726776673.16543: stdout chunk (state=3): >>><<< 10602 1726776673.16562: done transferring module to remote 10602 1726776673.16571: _low_level_execute_command(): starting 10602 1726776673.16576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/ /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_stat.py && sleep 0' 10602 1726776673.18944: stderr chunk (state=2): >>><<< 10602 1726776673.18951: stdout chunk (state=2): >>><<< 10602 1726776673.18964: _low_level_execute_command() done: rc=0, stdout=, stderr= 10602 1726776673.18968: _low_level_execute_command(): starting 10602 1726776673.18973: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_stat.py && sleep 0' 10602 1726776673.35528: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776672.9244826, "mtime": 1726776657.7796376, "ctime": 1726776657.7796376, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": 
"500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10602 1726776673.36731: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10602 1726776673.36778: stderr chunk (state=3): >>><<< 10602 1726776673.36786: stdout chunk (state=3): >>><<< 10602 1726776673.36803: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776672.9244826, "mtime": 1726776657.7796376, "ctime": 1726776657.7796376, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
10602 1726776673.36851: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10602 1726776673.36889: variable 'ansible_module_compression' from source: unknown 10602 1726776673.36923: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10602 1726776673.36942: variable 'ansible_facts' from source: unknown 10602 1726776673.37000: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_file.py 10602 1726776673.37103: Sending initial data 10602 1726776673.37111: Sent initial data (152 bytes) 10602 1726776673.39867: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp659zrf7r /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_file.py <<< 10602 1726776673.42043: stderr chunk (state=3): >>><<< 10602 1726776673.42054: stdout chunk (state=3): >>><<< 10602 1726776673.42074: done transferring module to remote 10602 1726776673.42084: _low_level_execute_command(): starting 10602 1726776673.42088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/ 
/root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_file.py && sleep 0' 10602 1726776673.44912: stderr chunk (state=2): >>><<< 10602 1726776673.44923: stdout chunk (state=2): >>><<< 10602 1726776673.44943: _low_level_execute_command() done: rc=0, stdout=, stderr= 10602 1726776673.44949: _low_level_execute_command(): starting 10602 1726776673.44954: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/AnsiballZ_file.py && sleep 0' 10602 1726776673.61615: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp3q6yjzmx", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10602 1726776673.62813: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10602 1726776673.62825: stdout chunk (state=3): >>><<< 10602 1726776673.62839: stderr chunk (state=3): >>><<< 10602 1726776673.62854: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp3q6yjzmx", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10602 1726776673.62889: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp3q6yjzmx', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10602 1726776673.62902: _low_level_execute_command(): starting 10602 1726776673.62908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776673.0945036-10602-210055409405011/ > /dev/null 2>&1 && sleep 0' 10602 1726776673.65935: stderr chunk (state=2): >>><<< 10602 1726776673.65946: stdout chunk (state=2): >>><<< 10602 1726776673.65965: _low_level_execute_command() done: rc=0, stdout=, stderr= 10602 1726776673.65974: handler run complete 10602 1726776673.66003: attempt loop complete, returning result 10602 1726776673.66009: _execute() done 10602 1726776673.66012: dumping result to json 10602 1726776673.66018: done dumping result, returning 10602 1726776673.66026: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-cec2-986e-00000000030e] 10602 1726776673.66036: sending task result for task 120fa90a-8a95-cec2-986e-00000000030e 10602 1726776673.66081: done sending task result for task 120fa90a-8a95-cec2-986e-00000000030e 10602 1726776673.66086: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8218 1726776673.66500: no more pending results, returning what we have 8218 1726776673.66503: results queue empty 8218 1726776673.66504: checking for any_errors_fatal 8218 1726776673.66510: done checking for any_errors_fatal 8218 1726776673.66511: checking for max_fail_percentage 8218 1726776673.66512: done checking for max_fail_percentage 8218 1726776673.66513: checking to see if all hosts have failed and the running result is not ok 8218 1726776673.66514: done checking to see if all hosts have failed 8218 1726776673.66515: getting the remaining hosts for this loop 8218 1726776673.66516: done getting the remaining hosts for this loop 8218 1726776673.66519: getting the next task for host managed_node2 8218 1726776673.66524: done getting next task for host managed_node2 8218 1726776673.66530: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8218 1726776673.66533: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776673.66543: getting variables 8218 1726776673.66544: in VariableManager get_vars() 8218 1726776673.66580: Calling all_inventory to load vars for managed_node2 8218 1726776673.66583: Calling groups_inventory to load vars for managed_node2 8218 1726776673.66585: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776673.66594: Calling all_plugins_play to load vars for managed_node2 8218 1726776673.66597: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776673.66600: Calling groups_plugins_play to load vars for managed_node2 8218 1726776673.66773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776673.67013: done with get_vars() 8218 1726776673.67024: done getting variables 8218 1726776673.67085: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 16:11:13 -0400 (0:00:00.632) 0:00:59.501 **** 8218 1726776673.67116: entering _queue_task() for managed_node2/copy 8218 1726776673.67305: worker is 1 (out of 1 available) 8218 1726776673.67318: exiting _queue_task() for managed_node2/copy 8218 1726776673.67332: done queuing things up, now waiting for results queue to drain 8218 1726776673.67335: waiting for pending results... 
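Both copy tasks above report `"changed": false` because the copy action first runs the `stat` module on the destination, compares checksums, and only falls through to the `file` module to enforce mode and ownership when the content already matches. A hedged shell sketch of that decision (file names are illustrative, not the role's actual paths):

```shell
# Hypothetical sketch of the idempotence check behind "changed": false:
# compare the SHA-1 of the desired content against the current remote file;
# rewrite only on mismatch, otherwise just fix up metadata.
workdir=$(mktemp -d)
printf 'manual\n' > "$workdir/desired"   # content the role wants to deploy
printf 'manual\n' > "$workdir/current"   # content already on the host
want=$(sha1sum "$workdir/desired" | cut -d' ' -f1)
have=$(sha1sum "$workdir/current" | cut -d' ' -f1)
if [ "$want" = "$have" ]; then
  echo "unchanged"   # only the file module runs, to enforce mode/ownership
else
  echo "changed"     # the copy module would transfer the new content
fi
rm -rf "$workdir"
```

This matches the transcript: each task runs `AnsiballZ_stat.py`, sees a matching checksum, then runs `AnsiballZ_file.py` with `force: false`.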
10631 1726776673.67570: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10631 1726776673.67700: in run() - task 120fa90a-8a95-cec2-986e-00000000030f 10631 1726776673.67717: variable 'ansible_search_path' from source: unknown 10631 1726776673.67721: variable 'ansible_search_path' from source: unknown 10631 1726776673.67755: calling self._execute() 10631 1726776673.67848: variable 'ansible_host' from source: host vars for 'managed_node2' 10631 1726776673.67858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10631 1726776673.67868: variable 'omit' from source: magic vars 10631 1726776673.67971: variable 'omit' from source: magic vars 10631 1726776673.68018: variable 'omit' from source: magic vars 10631 1726776673.68049: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10631 1726776673.68331: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10631 1726776673.68412: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10631 1726776673.68496: variable 'omit' from source: magic vars 10631 1726776673.68538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10631 1726776673.68573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10631 1726776673.68594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10631 1726776673.68612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10631 1726776673.68624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10631 1726776673.68653: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10631 1726776673.68660: variable 'ansible_host' from 
source: host vars for 'managed_node2' 10631 1726776673.68664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10631 1726776673.68759: Set connection var ansible_connection to ssh 10631 1726776673.68768: Set connection var ansible_pipelining to False 10631 1726776673.68774: Set connection var ansible_timeout to 10 10631 1726776673.68782: Set connection var ansible_module_compression to ZIP_DEFLATED 10631 1726776673.68787: Set connection var ansible_shell_type to sh 10631 1726776673.68793: Set connection var ansible_shell_executable to /bin/sh 10631 1726776673.68812: variable 'ansible_shell_executable' from source: unknown 10631 1726776673.68817: variable 'ansible_connection' from source: unknown 10631 1726776673.68820: variable 'ansible_module_compression' from source: unknown 10631 1726776673.68823: variable 'ansible_shell_type' from source: unknown 10631 1726776673.68826: variable 'ansible_shell_executable' from source: unknown 10631 1726776673.68831: variable 'ansible_host' from source: host vars for 'managed_node2' 10631 1726776673.68835: variable 'ansible_pipelining' from source: unknown 10631 1726776673.68838: variable 'ansible_timeout' from source: unknown 10631 1726776673.68842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10631 1726776673.68962: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10631 1726776673.68975: variable 'omit' from source: magic vars 10631 1726776673.68980: starting attempt loop 10631 1726776673.68983: running the handler 10631 1726776673.68994: _low_level_execute_command(): starting 10631 1726776673.69001: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10631 1726776673.71917: stdout chunk (state=2): 
>>>/root <<< 10631 1726776673.72027: stderr chunk (state=3): >>><<< 10631 1726776673.72040: stdout chunk (state=3): >>><<< 10631 1726776673.72059: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10631 1726776673.72072: _low_level_execute_command(): starting 10631 1726776673.72077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479 `" && echo ansible-tmp-1726776673.7206705-10631-222299045576479="` echo /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479 `" ) && sleep 0' 10631 1726776673.74687: stdout chunk (state=2): >>>ansible-tmp-1726776673.7206705-10631-222299045576479=/root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479 <<< 10631 1726776673.74801: stderr chunk (state=3): >>><<< 10631 1726776673.74807: stdout chunk (state=3): >>><<< 10631 1726776673.74820: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776673.7206705-10631-222299045576479=/root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479 , stderr= 10631 1726776673.74885: variable 'ansible_module_compression' from source: unknown 10631 1726776673.74930: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10631 1726776673.74960: variable 'ansible_facts' from source: unknown 10631 1726776673.75027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_stat.py 10631 1726776673.75109: Sending initial data 10631 1726776673.75116: Sent initial data (152 bytes) 10631 1726776673.77713: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmphrf8fy7x /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_stat.py <<< 10631 1726776673.78790: stderr chunk (state=3): >>><<< 10631 1726776673.78797: stdout 
chunk (state=3): >>><<< 10631 1726776673.78813: done transferring module to remote 10631 1726776673.78822: _low_level_execute_command(): starting 10631 1726776673.78825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/ /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_stat.py && sleep 0' 10631 1726776673.81274: stderr chunk (state=2): >>><<< 10631 1726776673.81283: stdout chunk (state=2): >>><<< 10631 1726776673.81296: _low_level_execute_command() done: rc=0, stdout=, stderr= 10631 1726776673.81300: _low_level_execute_command(): starting 10631 1726776673.81305: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_stat.py && sleep 0' 10631 1726776673.98083: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776657.6376371, "mtime": 1726776657.7806377, "ctime": 1726776657.7806377, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10631 1726776673.99256: stderr chunk 
(state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10631 1726776673.99305: stderr chunk (state=3): >>><<< 10631 1726776673.99312: stdout chunk (state=3): >>><<< 10631 1726776673.99330: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776657.6376371, "mtime": 1726776657.7806377, "ctime": 1726776657.7806377, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
10631 1726776673.99376: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10631 1726776673.99414: variable 'ansible_module_compression' from source: unknown 10631 1726776673.99451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10631 1726776673.99469: variable 'ansible_facts' from source: unknown 10631 1726776673.99525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_file.py 10631 1726776673.99614: Sending initial data 10631 1726776673.99620: Sent initial data (152 bytes) 10631 1726776674.02198: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpgo7ksbtc /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_file.py <<< 10631 1726776674.03310: stderr chunk (state=3): >>><<< 10631 1726776674.03318: stdout chunk (state=3): >>><<< 10631 1726776674.03338: done transferring module to remote 10631 1726776674.03349: _low_level_execute_command(): starting 10631 1726776674.03355: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/ 
/root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_file.py && sleep 0' 10631 1726776674.05739: stderr chunk (state=2): >>><<< 10631 1726776674.05750: stdout chunk (state=2): >>><<< 10631 1726776674.05764: _low_level_execute_command() done: rc=0, stdout=, stderr= 10631 1726776674.05768: _low_level_execute_command(): starting 10631 1726776674.05773: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/AnsiballZ_file.py && sleep 0' 10631 1726776674.22069: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpjsxdpqta", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10631 1726776674.23155: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10631 1726776674.23201: stderr chunk (state=3): >>><<< 10631 1726776674.23210: stdout chunk (state=3): >>><<< 10631 1726776674.23231: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpjsxdpqta", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10631 1726776674.23262: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpjsxdpqta', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10631 1726776674.23274: _low_level_execute_command(): starting 10631 1726776674.23279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776673.7206705-10631-222299045576479/ > /dev/null 2>&1 && sleep 0' 10631 1726776674.25685: stderr chunk (state=2): >>><<< 10631 1726776674.25692: stdout chunk (state=2): >>><<< 10631 1726776674.25706: _low_level_execute_command() done: rc=0, stdout=, stderr= 10631 1726776674.25715: handler run complete 10631 1726776674.25736: attempt loop complete, returning result 10631 1726776674.25740: _execute() done 10631 1726776674.25744: dumping result to json 10631 1726776674.25749: done dumping result, returning 10631 1726776674.25757: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-cec2-986e-00000000030f] 10631 1726776674.25764: sending task result for task 120fa90a-8a95-cec2-986e-00000000030f 10631 1726776674.25799: done sending task result for task 120fa90a-8a95-cec2-986e-00000000030f 10631 1726776674.25804: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8218 1726776674.25988: no more pending results, returning what we have 8218 1726776674.25991: results queue empty 8218 1726776674.25992: checking for any_errors_fatal 8218 1726776674.25999: done checking for any_errors_fatal 8218 1726776674.26000: checking for max_fail_percentage 8218 1726776674.26001: done checking for max_fail_percentage 8218 1726776674.26002: checking to see if all hosts have failed and the running result is not ok 8218 1726776674.26003: done checking to see if all hosts have failed 8218 1726776674.26003: getting the remaining hosts for this loop 8218 1726776674.26004: done getting the remaining hosts for this loop 8218 1726776674.26007: getting the next task for host managed_node2 8218 1726776674.26012: done getting next task for host managed_node2 8218 1726776674.26015: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8218 1726776674.26017: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776674.26027: getting variables 8218 1726776674.26030: in VariableManager get_vars() 8218 1726776674.26061: Calling all_inventory to load vars for managed_node2 8218 1726776674.26063: Calling groups_inventory to load vars for managed_node2 8218 1726776674.26064: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776674.26071: Calling all_plugins_play to load vars for managed_node2 8218 1726776674.26073: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776674.26075: Calling groups_plugins_play to load vars for managed_node2 8218 1726776674.26188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776674.26312: done with get_vars() 8218 1726776674.26321: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 16:11:14 -0400 (0:00:00.592) 0:01:00.094 **** 8218 1726776674.26386: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776674.26546: worker is 1 (out of 1 available) 8218 1726776674.26563: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776674.26575: done queuing things up, now waiting for results queue to drain 8218 1726776674.26577: waiting for pending results... 
10653 1726776674.26699: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10653 1726776674.26801: in run() - task 120fa90a-8a95-cec2-986e-000000000310 10653 1726776674.26816: variable 'ansible_search_path' from source: unknown 10653 1726776674.26820: variable 'ansible_search_path' from source: unknown 10653 1726776674.26849: calling self._execute() 10653 1726776674.26915: variable 'ansible_host' from source: host vars for 'managed_node2' 10653 1726776674.26925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10653 1726776674.26934: variable 'omit' from source: magic vars 10653 1726776674.27010: variable 'omit' from source: magic vars 10653 1726776674.27047: variable 'omit' from source: magic vars 10653 1726776674.27068: variable '__kernel_settings_profile_filename' from source: role '' all vars 10653 1726776674.27284: variable '__kernel_settings_profile_filename' from source: role '' all vars 10653 1726776674.27344: variable '__kernel_settings_profile_dir' from source: role '' all vars 10653 1726776674.27406: variable '__kernel_settings_profile_parent' from source: set_fact 10653 1726776674.27415: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10653 1726776674.27505: variable 'omit' from source: magic vars 10653 1726776674.27541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10653 1726776674.27569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10653 1726776674.27587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10653 1726776674.27600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10653 1726776674.27611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10653 1726776674.27637: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10653 1726776674.27643: variable 'ansible_host' from source: host vars for 'managed_node2' 10653 1726776674.27648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10653 1726776674.27713: Set connection var ansible_connection to ssh 10653 1726776674.27721: Set connection var ansible_pipelining to False 10653 1726776674.27727: Set connection var ansible_timeout to 10 10653 1726776674.27737: Set connection var ansible_module_compression to ZIP_DEFLATED 10653 1726776674.27743: Set connection var ansible_shell_type to sh 10653 1726776674.27748: Set connection var ansible_shell_executable to /bin/sh 10653 1726776674.27764: variable 'ansible_shell_executable' from source: unknown 10653 1726776674.27768: variable 'ansible_connection' from source: unknown 10653 1726776674.27771: variable 'ansible_module_compression' from source: unknown 10653 1726776674.27774: variable 'ansible_shell_type' from source: unknown 10653 1726776674.27777: variable 'ansible_shell_executable' from source: unknown 10653 1726776674.27780: variable 'ansible_host' from source: host vars for 'managed_node2' 10653 1726776674.27783: variable 'ansible_pipelining' from source: unknown 10653 1726776674.27784: variable 'ansible_timeout' from source: unknown 10653 1726776674.27786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10653 1726776674.27908: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10653 1726776674.27917: variable 'omit' from source: magic vars 10653 1726776674.27921: starting attempt loop 10653 1726776674.27923: running the handler 10653 1726776674.27934: _low_level_execute_command(): 
starting 10653 1726776674.27940: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10653 1726776674.30217: stdout chunk (state=2): >>>/root <<< 10653 1726776674.30332: stderr chunk (state=3): >>><<< 10653 1726776674.30338: stdout chunk (state=3): >>><<< 10653 1726776674.30355: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10653 1726776674.30367: _low_level_execute_command(): starting 10653 1726776674.30373: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468 `" && echo ansible-tmp-1726776674.3036246-10653-65769683216468="` echo /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468 `" ) && sleep 0' 10653 1726776674.32938: stdout chunk (state=2): >>>ansible-tmp-1726776674.3036246-10653-65769683216468=/root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468 <<< 10653 1726776674.33066: stderr chunk (state=3): >>><<< 10653 1726776674.33072: stdout chunk (state=3): >>><<< 10653 1726776674.33085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776674.3036246-10653-65769683216468=/root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468 , stderr= 10653 1726776674.33119: variable 'ansible_module_compression' from source: unknown 10653 1726776674.33152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10653 1726776674.33183: variable 'ansible_facts' from source: unknown 10653 1726776674.33244: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/AnsiballZ_kernel_settings_get_config.py 10653 1726776674.33337: Sending initial data 10653 1726776674.33344: Sent initial data (173 bytes) 10653 1726776674.35755: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpvh7blp4d /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/AnsiballZ_kernel_settings_get_config.py <<< 10653 1726776674.36801: stderr chunk (state=3): >>><<< 10653 1726776674.36808: stdout chunk (state=3): >>><<< 10653 1726776674.36830: done transferring module to remote 10653 1726776674.36841: _low_level_execute_command(): starting 10653 1726776674.36846: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/ /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10653 1726776674.39161: stderr chunk (state=2): >>><<< 10653 1726776674.39169: stdout chunk (state=2): >>><<< 10653 1726776674.39182: _low_level_execute_command() done: rc=0, stdout=, stderr= 10653 1726776674.39187: _low_level_execute_command(): starting 10653 1726776674.39192: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10653 1726776674.54946: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10653 1726776674.56019: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
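The `_low_level_execute_command()` calls above stage each module run in a private per-task directory: `umask 77 && mkdir -p <base> && mkdir <base>/ansible-tmp-<timestamp>-<pid>-<suffix>`. A rough Python rendering of that naming and permission pattern, under the assumption that only the umask and name format matter here (the real remote side runs the shell one-liner shown in the log, and the suffix is random, not the fixed `42` used below):

```python
import os
import stat
import tempfile
import time

def make_remote_tmp(base):
    """Recreate the log's tmpdir pattern: umask 77 plus a directory named
    ansible-tmp-<timestamp>-<pid>-<suffix>, readable only by the owner.
    Sketch only; the suffix here is a fixed stand-in for the random one."""
    old_umask = os.umask(0o77)
    try:
        os.makedirs(base, exist_ok=True)
        name = "ansible-tmp-%s-%d-%d" % (time.time(), os.getpid(), 42)
        path = os.path.join(base, name)
        os.mkdir(path)  # with umask 077 this lands as mode 0700
        return path
    finally:
        os.umask(old_umask)

base = os.path.join(tempfile.gettempdir(), "remote-tmp-demo")
tmpdir = make_remote_tmp(base)
mode = stat.S_IMODE(os.stat(tmpdir).st_mode)
os.rmdir(tmpdir)
```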
<<< 10653 1726776674.56070: stderr chunk (state=3): >>><<< 10653 1726776674.56078: stdout chunk (state=3): >>><<< 10653 1726776674.56095: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 10653 1726776674.56122: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10653 1726776674.56134: _low_level_execute_command(): starting 10653 1726776674.56140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776674.3036246-10653-65769683216468/ > /dev/null 2>&1 && sleep 0' 10653 1726776674.58571: stderr chunk (state=2): >>><<< 10653 1726776674.58579: stdout chunk (state=2): >>><<< 10653 1726776674.58593: _low_level_execute_command() done: rc=0, stdout=, stderr= 10653 1726776674.58600: handler run complete 
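The `kernel_settings_get_config` payload above is `/etc/tuned/kernel_settings/tuned.conf` parsed into `{section: {key: value}}` (sections `main`, `sysctl`, `sysfs`). The file is INI-style, so the shape can be reproduced with `configparser`; this is a sketch of the returned structure, not the collection module's actual code:

```python
import configparser

def read_tuned_conf(text):
    """Parse tuned.conf-style INI text into {section: {key: value}},
    matching the shape of the module result in the log."""
    parser = configparser.ConfigParser()
    # sysfs keys such as /sys/class/net/lo/mtu must be kept verbatim,
    # so disable the default lower-casing of option names
    parser.optionxform = str
    parser.read_string(text)
    return {section: dict(parser[section]) for section in parser.sections()}

sample = """\
[main]
summary = kernel settings

[sysctl]
fs.file-max = 400000
vm.max_map_count = 65530
"""
data = read_tuned_conf(sample)
```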
10653 1726776674.58618: attempt loop complete, returning result 10653 1726776674.58623: _execute() done 10653 1726776674.58626: dumping result to json 10653 1726776674.58631: done dumping result, returning 10653 1726776674.58639: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-cec2-986e-000000000310] 10653 1726776674.58646: sending task result for task 120fa90a-8a95-cec2-986e-000000000310 10653 1726776674.58678: done sending task result for task 120fa90a-8a95-cec2-986e-000000000310 10653 1726776674.58682: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8218 1726776674.58830: no more pending results, returning what we have 8218 1726776674.58834: results queue empty 8218 1726776674.58834: checking for any_errors_fatal 8218 1726776674.58841: done checking for any_errors_fatal 8218 1726776674.58842: checking for max_fail_percentage 8218 1726776674.58843: done checking for max_fail_percentage 8218 1726776674.58844: checking to see if all hosts have failed and the running result is not ok 8218 1726776674.58845: done checking to see if all hosts have failed 8218 1726776674.58845: getting the remaining hosts for this loop 8218 1726776674.58846: done getting the remaining hosts for this loop 8218 1726776674.58849: getting the next task for host managed_node2 8218 1726776674.58855: done getting next task for host managed_node2 8218 1726776674.58858: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8218 1726776674.58861: ^ state is: HOST STATE: block=2, task=26, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776674.58871: getting variables 8218 1726776674.58872: in VariableManager get_vars() 8218 1726776674.58905: Calling all_inventory to load vars for managed_node2 8218 1726776674.58908: Calling groups_inventory to load vars for managed_node2 8218 1726776674.58909: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776674.58918: Calling all_plugins_play to load vars for managed_node2 8218 1726776674.58920: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776674.58923: Calling groups_plugins_play to load vars for managed_node2 8218 1726776674.59076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776674.59195: done with get_vars() 8218 1726776674.59204: done getting variables 8218 1726776674.59250: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 16:11:14 -0400 (0:00:00.328) 0:01:00.423 **** 8218 1726776674.59275: entering _queue_task() for 
managed_node2/template 8218 1726776674.59437: worker is 1 (out of 1 available) 8218 1726776674.59450: exiting _queue_task() for managed_node2/template 8218 1726776674.59462: done queuing things up, now waiting for results queue to drain 8218 1726776674.59464: waiting for pending results... 10661 1726776674.59591: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10661 1726776674.59697: in run() - task 120fa90a-8a95-cec2-986e-000000000311 10661 1726776674.59713: variable 'ansible_search_path' from source: unknown 10661 1726776674.59717: variable 'ansible_search_path' from source: unknown 10661 1726776674.59746: calling self._execute() 10661 1726776674.59815: variable 'ansible_host' from source: host vars for 'managed_node2' 10661 1726776674.59824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10661 1726776674.59834: variable 'omit' from source: magic vars 10661 1726776674.59909: variable 'omit' from source: magic vars 10661 1726776674.59946: variable 'omit' from source: magic vars 10661 1726776674.60189: variable '__kernel_settings_profile_src' from source: role '' all vars 10661 1726776674.60198: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10661 1726776674.60260: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10661 1726776674.60281: variable '__kernel_settings_profile_filename' from source: role '' all vars 10661 1726776674.60325: variable '__kernel_settings_profile_filename' from source: role '' all vars 10661 1726776674.60380: variable '__kernel_settings_profile_dir' from source: role '' all vars 10661 1726776674.60440: variable '__kernel_settings_profile_parent' from source: set_fact 10661 1726776674.60451: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10661 1726776674.60476: variable 'omit' from source: magic vars 10661 1726776674.60508: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10661 1726776674.60536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10661 1726776674.60557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10661 1726776674.60573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10661 1726776674.60584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10661 1726776674.60608: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10661 1726776674.60614: variable 'ansible_host' from source: host vars for 'managed_node2' 10661 1726776674.60618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10661 1726776674.60690: Set connection var ansible_connection to ssh 10661 1726776674.60698: Set connection var ansible_pipelining to False 10661 1726776674.60704: Set connection var ansible_timeout to 10 10661 1726776674.60711: Set connection var ansible_module_compression to ZIP_DEFLATED 10661 1726776674.60716: Set connection var ansible_shell_type to sh 10661 1726776674.60721: Set connection var ansible_shell_executable to /bin/sh 10661 1726776674.60740: variable 'ansible_shell_executable' from source: unknown 10661 1726776674.60744: variable 'ansible_connection' from source: unknown 10661 1726776674.60751: variable 'ansible_module_compression' from source: unknown 10661 1726776674.60754: variable 'ansible_shell_type' from source: unknown 10661 1726776674.60758: variable 'ansible_shell_executable' from source: unknown 10661 1726776674.60761: variable 'ansible_host' from source: host vars for 'managed_node2' 10661 1726776674.60765: variable 'ansible_pipelining' from source: unknown 10661 1726776674.60768: variable 'ansible_timeout' from 
source: unknown 10661 1726776674.60772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10661 1726776674.60867: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10661 1726776674.60876: variable 'omit' from source: magic vars 10661 1726776674.60880: starting attempt loop 10661 1726776674.60883: running the handler 10661 1726776674.60892: _low_level_execute_command(): starting 10661 1726776674.60898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10661 1726776674.63224: stdout chunk (state=2): >>>/root <<< 10661 1726776674.63341: stderr chunk (state=3): >>><<< 10661 1726776674.63350: stdout chunk (state=3): >>><<< 10661 1726776674.63371: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10661 1726776674.63384: _low_level_execute_command(): starting 10661 1726776674.63390: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788 `" && echo ansible-tmp-1726776674.633794-10661-168136418473788="` echo /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788 `" ) && sleep 0' 10661 1726776674.67009: stdout chunk (state=2): >>>ansible-tmp-1726776674.633794-10661-168136418473788=/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788 <<< 10661 1726776674.67138: stderr chunk (state=3): >>><<< 10661 1726776674.67145: stdout chunk (state=3): >>><<< 10661 1726776674.67161: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776674.633794-10661-168136418473788=/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788 , stderr= 10661 1726776674.67178: evaluation_path: 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10661 1726776674.67195: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10661 1726776674.67216: variable 'ansible_search_path' from source: unknown 10661 1726776674.67766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10661 1726776674.69194: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10661 1726776674.69241: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10661 1726776674.69271: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10661 1726776674.69297: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10661 1726776674.69326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10661 1726776674.69510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 10661 1726776674.69532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10661 1726776674.69555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10661 1726776674.69582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10661 1726776674.69594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10661 1726776674.69812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10661 1726776674.69832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10661 1726776674.69851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10661 1726776674.69877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10661 1726776674.69888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10661 1726776674.70138: variable 'ansible_managed' from source: unknown 10661 1726776674.70146: variable '__sections' from source: task vars 10661 1726776674.70235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10661 1726776674.70254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10661 1726776674.70271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10661 1726776674.70296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10661 1726776674.70307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10661 1726776674.70377: variable 'kernel_settings_sysctl' from source: include params 10661 1726776674.70388: variable '__kernel_settings_state_empty' from source: role '' all vars 10661 1726776674.70393: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10661 1726776674.70436: variable '__sysctl_old' from source: task vars 10661 1726776674.70484: variable '__sysctl_old' from source: task vars 10661 1726776674.70622: variable 'kernel_settings_purge' from source: role '' defaults 10661 1726776674.70630: variable 'kernel_settings_sysctl' from source: include params 10661 
1726776674.70638: variable '__kernel_settings_state_empty' from source: role '' all vars 10661 1726776674.70643: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10661 1726776674.70650: variable '__kernel_settings_profile_contents' from source: set_fact 10661 1726776674.70789: variable 'kernel_settings_sysfs' from source: role '' defaults 10661 1726776674.70795: variable '__kernel_settings_state_empty' from source: role '' all vars 10661 1726776674.70801: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10661 1726776674.70815: variable '__sysfs_old' from source: task vars 10661 1726776674.70861: variable '__sysfs_old' from source: task vars 10661 1726776674.70996: variable 'kernel_settings_purge' from source: role '' defaults 10661 1726776674.71004: variable 'kernel_settings_sysfs' from source: role '' defaults 10661 1726776674.71009: variable '__kernel_settings_state_empty' from source: role '' all vars 10661 1726776674.71014: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10661 1726776674.71019: variable '__kernel_settings_profile_contents' from source: set_fact 10661 1726776674.71056: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 10661 1726776674.71064: variable '__systemd_old' from source: task vars 10661 1726776674.71105: variable '__systemd_old' from source: task vars 10661 1726776674.71241: variable 'kernel_settings_purge' from source: role '' defaults 10661 1726776674.71250: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 10661 1726776674.71255: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71260: variable '__kernel_settings_profile_contents' from source: set_fact 10661 1726776674.71273: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 10661 1726776674.71278: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' 
defaults 10661 1726776674.71282: variable '__trans_huge_old' from source: task vars 10661 1726776674.71322: variable '__trans_huge_old' from source: task vars 10661 1726776674.71455: variable 'kernel_settings_purge' from source: role '' defaults 10661 1726776674.71461: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 10661 1726776674.71466: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71472: variable '__kernel_settings_profile_contents' from source: set_fact 10661 1726776674.71482: variable '__trans_defrag_old' from source: task vars 10661 1726776674.71523: variable '__trans_defrag_old' from source: task vars 10661 1726776674.71655: variable 'kernel_settings_purge' from source: role '' defaults 10661 1726776674.71662: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 10661 1726776674.71667: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71673: variable '__kernel_settings_profile_contents' from source: set_fact 10661 1726776674.71686: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71695: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71711: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71720: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71726: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71740: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71750: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71755: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71760: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71768: variable 
'__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71775: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71781: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71786: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.71791: variable '__kernel_settings_state_absent' from source: role '' all vars 10661 1726776674.72212: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10661 1726776674.72258: variable 'ansible_module_compression' from source: unknown 10661 1726776674.72296: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10661 1726776674.72320: variable 'ansible_facts' from source: unknown 10661 1726776674.72382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_stat.py 10661 1726776674.72467: Sending initial data 10661 1726776674.72473: Sent initial data (151 bytes) 10661 1726776674.75064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp0zvrujnb /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_stat.py <<< 10661 1726776674.76141: stderr chunk (state=3): >>><<< 10661 1726776674.76147: stdout chunk (state=3): >>><<< 10661 1726776674.76164: done transferring module to remote 10661 1726776674.76175: _low_level_execute_command(): starting 10661 1726776674.76181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/ 
/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_stat.py && sleep 0' 10661 1726776674.78693: stderr chunk (state=2): >>><<< 10661 1726776674.78699: stdout chunk (state=2): >>><<< 10661 1726776674.78712: _low_level_execute_command() done: rc=0, stdout=, stderr= 10661 1726776674.78716: _low_level_execute_command(): starting 10661 1726776674.78721: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_stat.py && sleep 0' 10661 1726776674.95476: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 8388806, "dev": 51713, "nlink": 1, "atime": 1726776634.1509304, "mtime": 1726776633.0659268, "ctime": 1726776633.3129275, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "mimetype": "text/plain", "charset": "us-ascii", "version": "737706233", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10661 1726776674.96650: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10661 1726776674.96699: stderr chunk (state=3): >>><<< 10661 1726776674.96706: stdout chunk (state=3): >>><<< 10661 1726776674.96722: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 8388806, "dev": 51713, "nlink": 1, "atime": 1726776634.1509304, "mtime": 1726776633.0659268, "ctime": 1726776633.3129275, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "mimetype": "text/plain", "charset": "us-ascii", "version": "737706233", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
10661 1726776674.96763: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10661 1726776674.96854: Sending initial data 10661 1726776674.96862: Sent initial data (159 bytes) 10661 1726776674.99437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpy5a_zizc/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source <<< 10661 1726776674.99809: stderr chunk (state=3): >>><<< 10661 1726776674.99820: stdout chunk (state=3): >>><<< 10661 1726776674.99838: _low_level_execute_command(): starting 10661 1726776674.99845: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/ /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source && sleep 0' 10661 1726776675.02243: stderr chunk (state=2): >>><<< 10661 1726776675.02255: stdout chunk (state=2): >>><<< 10661 1726776675.02271: _low_level_execute_command() done: rc=0, stdout=, stderr= 10661 1726776675.02292: variable 'ansible_module_compression' from source: unknown 10661 1726776675.02330: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10661 1726776675.02349: 
variable 'ansible_facts' from source: unknown 10661 1726776675.02403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_copy.py 10661 1726776675.02493: Sending initial data 10661 1726776675.02500: Sent initial data (151 bytes) 10661 1726776675.05050: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpv9q0_1gl /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_copy.py <<< 10661 1726776675.06180: stderr chunk (state=3): >>><<< 10661 1726776675.06189: stdout chunk (state=3): >>><<< 10661 1726776675.06209: done transferring module to remote 10661 1726776675.06219: _low_level_execute_command(): starting 10661 1726776675.06224: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/ /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_copy.py && sleep 0' 10661 1726776675.08635: stderr chunk (state=2): >>><<< 10661 1726776675.08646: stdout chunk (state=2): >>><<< 10661 1726776675.08665: _low_level_execute_command() done: rc=0, stdout=, stderr= 10661 1726776675.08670: _low_level_execute_command(): starting 10661 1726776675.08675: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/AnsiballZ_copy.py && sleep 0' 10661 1726776675.25862: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "invocation": {"module_args": {"src": 
"/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10661 1726776675.27035: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10661 1726776675.27084: stderr chunk (state=3): >>><<< 10661 1726776675.27091: stdout chunk (state=3): >>><<< 10661 1726776675.27107: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
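Each `_low_level_execute_command()` sequence in the log follows the same three-step lifecycle for a module run: make the transferred AnsiballZ payload executable, run it under the remote interpreter, then remove the temp directory. A sketch of those shell strings, with a placeholder tmpdir in place of the real timestamped path:

```python
# Placeholder path; the actual remote tmpdir in the log is timestamped
# (ansible-tmp-<epoch>-<pid>-<random>).
tmpdir = "/root/.ansible/tmp/ansible-tmp-EXAMPLE"
module = f"{tmpdir}/AnsiballZ_copy.py"

# The three commands the log shows for one module invocation, in order.
commands = [
    f"chmod u+x {tmpdir}/ {module} && sleep 0",           # 1. mark payload executable
    f"/usr/libexec/platform-python {module} && sleep 0",  # 2. run the module
    f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0",    # 3. clean up the tmpdir
]
for cmd in commands:
    print(cmd)
```

The trailing `&& sleep 0` on every command is visible throughout the log; with `_ansible_keep_remote_files: False` (as in the invocation dict above), step 3 always runs once the result has been read back.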
10661 1726776675.27133: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '221aa34fef95c2fe05408be9921820449785a5b2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10661 1726776675.27164: _low_level_execute_command(): starting 10661 1726776675.27172: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/ > /dev/null 2>&1 && sleep 0' 10661 1726776675.29594: stderr chunk (state=2): >>><<< 10661 1726776675.29605: stdout chunk (state=2): >>><<< 10661 1726776675.29620: _low_level_execute_command() done: rc=0, stdout=, stderr= 10661 1726776675.29632: handler run complete 10661 1726776675.29655: attempt loop complete, returning result 10661 1726776675.29662: _execute() done 10661 1726776675.29665: dumping result to json 10661 1726776675.29670: done dumping result, returning 10661 1726776675.29678: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-cec2-986e-000000000311] 10661 1726776675.29684: sending task result for task 120fa90a-8a95-cec2-986e-000000000311 10661 1726776675.29726: done sending task result for task 
120fa90a-8a95-cec2-986e-000000000311 10661 1726776675.29732: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "src": "/root/.ansible/tmp/ansible-tmp-1726776674.633794-10661-168136418473788/source", "state": "file", "uid": 0 } 8218 1726776675.29911: no more pending results, returning what we have 8218 1726776675.29914: results queue empty 8218 1726776675.29914: checking for any_errors_fatal 8218 1726776675.29920: done checking for any_errors_fatal 8218 1726776675.29920: checking for max_fail_percentage 8218 1726776675.29922: done checking for max_fail_percentage 8218 1726776675.29923: checking to see if all hosts have failed and the running result is not ok 8218 1726776675.29923: done checking to see if all hosts have failed 8218 1726776675.29924: getting the remaining hosts for this loop 8218 1726776675.29925: done getting the remaining hosts for this loop 8218 1726776675.29930: getting the next task for host managed_node2 8218 1726776675.29936: done getting next task for host managed_node2 8218 1726776675.29939: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8218 1726776675.29942: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8218 1726776675.29952: getting variables 8218 1726776675.29954: in VariableManager get_vars() 8218 1726776675.29986: Calling all_inventory to load vars for managed_node2 8218 1726776675.29989: Calling groups_inventory to load vars for managed_node2 8218 1726776675.29991: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776675.30000: Calling all_plugins_play to load vars for managed_node2 8218 1726776675.30002: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776675.30005: Calling groups_plugins_play to load vars for managed_node2 8218 1726776675.30118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776675.30242: done with get_vars() 8218 1726776675.30251: done getting variables 8218 1726776675.30293: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:11:15 -0400 (0:00:00.710) 0:01:01.133 **** 8218 1726776675.30319: entering _queue_task() for managed_node2/service 8218 1726776675.30490: worker is 1 (out of 1 available) 8218 1726776675.30504: exiting _queue_task() for managed_node2/service 8218 1726776675.30517: done queuing things up, now waiting for results queue to drain 8218 1726776675.30518: waiting for pending results... 
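The timing line in the task banner above — `(0:00:00.710) 0:01:01.133` — is the per-task duration followed by the cumulative playbook elapsed time, rendered as `H:MM:SS.mmm`. A small sketch of that formatting (the timing callback's actual implementation may differ):

```python
def fmt_elapsed(seconds: float) -> str:
    """Render an elapsed time as H:MM:SS.mmm, like the task-timing banner."""
    ms = round(seconds * 1000)
    s, ms = divmod(ms, 1000)
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    return f"{h}:{m:02d}:{s:02d}.{ms:03d}"

print(fmt_elapsed(0.710))   # per-task duration   -> 0:00:00.710
print(fmt_elapsed(61.133))  # cumulative elapsed  -> 0:01:01.133
```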
10679 1726776675.30647: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10679 1726776675.30764: in run() - task 120fa90a-8a95-cec2-986e-000000000312 10679 1726776675.30781: variable 'ansible_search_path' from source: unknown 10679 1726776675.30785: variable 'ansible_search_path' from source: unknown 10679 1726776675.30820: variable '__kernel_settings_services' from source: include_vars 10679 1726776675.31064: variable '__kernel_settings_services' from source: include_vars 10679 1726776675.31199: variable 'omit' from source: magic vars 10679 1726776675.31277: variable 'ansible_host' from source: host vars for 'managed_node2' 10679 1726776675.31286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10679 1726776675.31293: variable 'omit' from source: magic vars 10679 1726776675.31474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10679 1726776675.31645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10679 1726776675.31680: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10679 1726776675.31706: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10679 1726776675.31735: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10679 1726776675.31810: variable '__kernel_settings_register_profile' from source: set_fact 10679 1726776675.31822: variable '__kernel_settings_register_mode' from source: set_fact 10679 1726776675.31839: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 10679 1726776675.31843: when evaluation is False, skipping this task 10679 1726776675.31863: variable 'item' from source: unknown 10679 
1726776675.31907: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 10679 1726776675.31934: dumping result to json 10679 1726776675.31938: done dumping result, returning 10679 1726776675.31942: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-cec2-986e-000000000312] 10679 1726776675.31947: sending task result for task 120fa90a-8a95-cec2-986e-000000000312 10679 1726776675.31963: done sending task result for task 120fa90a-8a95-cec2-986e-000000000312 10679 1726776675.31965: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8218 1726776675.32268: no more pending results, returning what we have 8218 1726776675.32271: results queue empty 8218 1726776675.32271: checking for any_errors_fatal 8218 1726776675.32279: done checking for any_errors_fatal 8218 1726776675.32280: checking for max_fail_percentage 8218 1726776675.32281: done checking for max_fail_percentage 8218 1726776675.32282: checking to see if all hosts have failed and the running result is not ok 8218 1726776675.32282: done checking to see if all hosts have failed 8218 1726776675.32283: getting the remaining hosts for this loop 8218 1726776675.32284: done getting the remaining hosts for this loop 8218 1726776675.32286: getting the next task for host managed_node2 8218 1726776675.32290: done getting next task for host managed_node2 8218 1726776675.32292: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8218 1726776675.32294: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776675.32304: getting variables 8218 1726776675.32305: in VariableManager get_vars() 8218 1726776675.32328: Calling all_inventory to load vars for managed_node2 8218 1726776675.32332: Calling groups_inventory to load vars for managed_node2 8218 1726776675.32333: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776675.32342: Calling all_plugins_play to load vars for managed_node2 8218 1726776675.32344: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776675.32346: Calling groups_plugins_play to load vars for managed_node2 8218 1726776675.32452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776675.32566: done with get_vars() 8218 1726776675.32574: done getting variables 8218 1726776675.32615: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:11:15 -0400 (0:00:00.023) 0:01:01.156 **** 8218 1726776675.32639: entering _queue_task() for managed_node2/command 8218 1726776675.32797: worker is 1 (out of 1 available) 8218 
1726776675.32810: exiting _queue_task() for managed_node2/command 8218 1726776675.32822: done queuing things up, now waiting for results queue to drain 8218 1726776675.32824: waiting for pending results... 10680 1726776675.32947: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10680 1726776675.33051: in run() - task 120fa90a-8a95-cec2-986e-000000000313 10680 1726776675.33066: variable 'ansible_search_path' from source: unknown 10680 1726776675.33070: variable 'ansible_search_path' from source: unknown 10680 1726776675.33096: calling self._execute() 10680 1726776675.33161: variable 'ansible_host' from source: host vars for 'managed_node2' 10680 1726776675.33171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10680 1726776675.33179: variable 'omit' from source: magic vars 10680 1726776675.33500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10680 1726776675.33775: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10680 1726776675.33810: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10680 1726776675.33837: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10680 1726776675.33866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10680 1726776675.33951: variable '__kernel_settings_register_profile' from source: set_fact 10680 1726776675.33973: Evaluated conditional (not __kernel_settings_register_profile is changed): True 10680 1726776675.34066: variable '__kernel_settings_register_mode' from source: set_fact 10680 1726776675.34077: Evaluated conditional (not __kernel_settings_register_mode is changed): True 10680 1726776675.34157: variable '__kernel_settings_register_apply' from source: set_fact 10680 
1726776675.34167: Evaluated conditional (__kernel_settings_register_apply is changed): True 10680 1726776675.34174: variable 'omit' from source: magic vars 10680 1726776675.34206: variable 'omit' from source: magic vars 10680 1726776675.34291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10680 1726776675.35722: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10680 1726776675.35781: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10680 1726776675.35810: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10680 1726776675.35836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10680 1726776675.35859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10680 1726776675.35914: variable '__kernel_settings_active_profile' from source: set_fact 10680 1726776675.35944: variable 'omit' from source: magic vars 10680 1726776675.35970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10680 1726776675.35992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10680 1726776675.36008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10680 1726776675.36021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10680 1726776675.36032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10680 1726776675.36057: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10680 1726776675.36062: variable 'ansible_host' 
from source: host vars for 'managed_node2' 10680 1726776675.36066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10680 1726776675.36132: Set connection var ansible_connection to ssh 10680 1726776675.36138: Set connection var ansible_pipelining to False 10680 1726776675.36142: Set connection var ansible_timeout to 10 10680 1726776675.36147: Set connection var ansible_module_compression to ZIP_DEFLATED 10680 1726776675.36152: Set connection var ansible_shell_type to sh 10680 1726776675.36155: Set connection var ansible_shell_executable to /bin/sh 10680 1726776675.36169: variable 'ansible_shell_executable' from source: unknown 10680 1726776675.36172: variable 'ansible_connection' from source: unknown 10680 1726776675.36173: variable 'ansible_module_compression' from source: unknown 10680 1726776675.36175: variable 'ansible_shell_type' from source: unknown 10680 1726776675.36178: variable 'ansible_shell_executable' from source: unknown 10680 1726776675.36180: variable 'ansible_host' from source: host vars for 'managed_node2' 10680 1726776675.36182: variable 'ansible_pipelining' from source: unknown 10680 1726776675.36184: variable 'ansible_timeout' from source: unknown 10680 1726776675.36186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10680 1726776675.36264: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10680 1726776675.36275: variable 'omit' from source: magic vars 10680 1726776675.36280: starting attempt loop 10680 1726776675.36283: running the handler 10680 1726776675.36295: _low_level_execute_command(): starting 10680 1726776675.36302: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10680 1726776675.38619: stdout chunk 
(state=2): >>>/root <<< 10680 1726776675.38740: stderr chunk (state=3): >>><<< 10680 1726776675.38748: stdout chunk (state=3): >>><<< 10680 1726776675.38774: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10680 1726776675.38786: _low_level_execute_command(): starting 10680 1726776675.38792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022 `" && echo ansible-tmp-1726776675.3878229-10680-262096668777022="` echo /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022 `" ) && sleep 0' 10680 1726776675.41482: stdout chunk (state=2): >>>ansible-tmp-1726776675.3878229-10680-262096668777022=/root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022 <<< 10680 1726776675.41612: stderr chunk (state=3): >>><<< 10680 1726776675.41619: stdout chunk (state=3): >>><<< 10680 1726776675.41638: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776675.3878229-10680-262096668777022=/root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022 , stderr= 10680 1726776675.41667: variable 'ansible_module_compression' from source: unknown 10680 1726776675.41704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10680 1726776675.41736: variable 'ansible_facts' from source: unknown 10680 1726776675.41810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/AnsiballZ_command.py 10680 1726776675.41915: Sending initial data 10680 1726776675.41922: Sent initial data (155 bytes) 10680 1726776675.44443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpmx13iw0k /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/AnsiballZ_command.py <<< 10680 1726776675.45509: stderr chunk (state=3): >>><<< 10680 
1726776675.45519: stdout chunk (state=3): >>><<< 10680 1726776675.45540: done transferring module to remote 10680 1726776675.45554: _low_level_execute_command(): starting 10680 1726776675.45560: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/ /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/AnsiballZ_command.py && sleep 0' 10680 1726776675.47904: stderr chunk (state=2): >>><<< 10680 1726776675.47914: stdout chunk (state=2): >>><<< 10680 1726776675.47931: _low_level_execute_command() done: rc=0, stdout=, stderr= 10680 1726776675.47936: _low_level_execute_command(): starting 10680 1726776675.47941: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/AnsiballZ_command.py && sleep 0' 10680 1726776676.78971: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:15.628217", "end": "2024-09-19 16:11:16.785061", "delta": "0:00:01.156844", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10680 1726776676.80094: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10680 1726776676.80144: stderr chunk (state=3): >>><<< 10680 1726776676.80154: stdout chunk (state=3): >>><<< 10680 1726776676.80172: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:15.628217", "end": "2024-09-19 16:11:16.785061", "delta": "0:00:01.156844", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10680 1726776676.80202: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10680 1726776676.80212: _low_level_execute_command(): starting 10680 1726776676.80219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776675.3878229-10680-262096668777022/ > /dev/null 2>&1 && sleep 0' 10680 1726776676.82749: stderr chunk (state=2): >>><<< 10680 1726776676.82765: stdout chunk (state=2): >>><<< 10680 1726776676.82782: _low_level_execute_command() done: rc=0, stdout=, stderr= 10680 
1726776676.82790: handler run complete 10680 1726776676.82807: Evaluated conditional (True): True 10680 1726776676.82816: attempt loop complete, returning result 10680 1726776676.82819: _execute() done 10680 1726776676.82822: dumping result to json 10680 1726776676.82827: done dumping result, returning 10680 1726776676.82836: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-cec2-986e-000000000313] 10680 1726776676.82842: sending task result for task 120fa90a-8a95-cec2-986e-000000000313 10680 1726776676.82874: done sending task result for task 120fa90a-8a95-cec2-986e-000000000313 10680 1726776676.82879: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.156844", "end": "2024-09-19 16:11:16.785061", "rc": 0, "start": "2024-09-19 16:11:15.628217" } 8218 1726776676.83041: no more pending results, returning what we have 8218 1726776676.83044: results queue empty 8218 1726776676.83045: checking for any_errors_fatal 8218 1726776676.83052: done checking for any_errors_fatal 8218 1726776676.83053: checking for max_fail_percentage 8218 1726776676.83055: done checking for max_fail_percentage 8218 1726776676.83056: checking to see if all hosts have failed and the running result is not ok 8218 1726776676.83057: done checking to see if all hosts have failed 8218 1726776676.83057: getting the remaining hosts for this loop 8218 1726776676.83060: done getting the remaining hosts for this loop 8218 1726776676.83063: getting the next task for host managed_node2 8218 1726776676.83069: done getting next task for host managed_node2 8218 1726776676.83072: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8218 1726776676.83074: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776676.83085: getting variables 8218 1726776676.83087: in VariableManager get_vars() 8218 1726776676.83121: Calling all_inventory to load vars for managed_node2 8218 1726776676.83124: Calling groups_inventory to load vars for managed_node2 8218 1726776676.83126: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776676.83136: Calling all_plugins_play to load vars for managed_node2 8218 1726776676.83139: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776676.83142: Calling groups_plugins_play to load vars for managed_node2 8218 1726776676.83254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776676.83443: done with get_vars() 8218 1726776676.83452: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:11:16 -0400 (0:00:01.508) 0:01:02.665 **** 8218 1726776676.83519: entering _queue_task() for managed_node2/include_tasks 8218 1726776676.83694: worker is 1 (out of 1 available) 8218 1726776676.83708: exiting _queue_task() for managed_node2/include_tasks 8218 1726776676.83720: done queuing things up, now waiting for results queue to drain 8218 1726776676.83722: waiting for pending results... 
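An aside on the apply-settings result shown above: the module reports `"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"]` even though the task passed a single string. With `_uses_shell: false`, the command module splits `_raw_params` using POSIX quoting rules, so the single-quoted profile list survives as one argv element. A minimal sketch of that split using `shlex` (the real module additionally handles `argv`, `expand_argument_vars`, `chdir`, and friends):

```python
import shlex

# Approximation of how ansible.legacy.command turns _raw_params into argv
# when _uses_shell is false: a POSIX shlex split, so the single-quoted
# profile list "virtual-guest kernel_settings" stays one argument.
raw_params = "tuned-adm profile 'virtual-guest kernel_settings'"
argv = shlex.split(raw_params)
print(argv)  # ['tuned-adm', 'profile', 'virtual-guest kernel_settings']
```

This matches the `cmd` field in the JSON result captured in the log.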
10705 1726776676.83859: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 10705 1726776676.83977: in run() - task 120fa90a-8a95-cec2-986e-000000000314 10705 1726776676.83995: variable 'ansible_search_path' from source: unknown 10705 1726776676.83999: variable 'ansible_search_path' from source: unknown 10705 1726776676.84030: calling self._execute() 10705 1726776676.84102: variable 'ansible_host' from source: host vars for 'managed_node2' 10705 1726776676.84112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10705 1726776676.84123: variable 'omit' from source: magic vars 10705 1726776676.84466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10705 1726776676.84660: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10705 1726776676.84697: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10705 1726776676.84727: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10705 1726776676.84759: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10705 1726776676.84854: variable '__kernel_settings_register_apply' from source: set_fact 10705 1726776676.84878: Evaluated conditional (__kernel_settings_register_apply is changed): True 10705 1726776676.84886: _execute() done 10705 1726776676.84890: dumping result to json 10705 1726776676.84894: done dumping result, returning 10705 1726776676.84901: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-cec2-986e-000000000314] 10705 1726776676.84908: sending task result for task 120fa90a-8a95-cec2-986e-000000000314 10705 1726776676.84936: done sending task result for task 120fa90a-8a95-cec2-986e-000000000314 10705 
1726776676.84940: WORKER PROCESS EXITING 8218 1726776676.85058: no more pending results, returning what we have 8218 1726776676.85063: in VariableManager get_vars() 8218 1726776676.85102: Calling all_inventory to load vars for managed_node2 8218 1726776676.85105: Calling groups_inventory to load vars for managed_node2 8218 1726776676.85106: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776676.85116: Calling all_plugins_play to load vars for managed_node2 8218 1726776676.85119: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776676.85121: Calling groups_plugins_play to load vars for managed_node2 8218 1726776676.85256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776676.85377: done with get_vars() 8218 1726776676.85384: variable 'ansible_search_path' from source: unknown 8218 1726776676.85384: variable 'ansible_search_path' from source: unknown 8218 1726776676.85411: we have included files to process 8218 1726776676.85412: generating all_blocks data 8218 1726776676.85415: done generating all_blocks data 8218 1726776676.85419: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8218 1726776676.85420: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8218 1726776676.85422: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8218 1726776676.85688: done processing included file 8218 1726776676.85690: iterating over new_blocks loaded from include file 8218 1726776676.85691: in VariableManager get_vars() 8218 1726776676.85708: done with get_vars() 8218 
1726776676.85709: filtering new block on tags 8218 1726776676.85745: done filtering new block on tags 8218 1726776676.85747: done iterating over new_blocks loaded from include file 8218 1726776676.85747: extending task lists for all hosts with included blocks 8218 1726776676.86134: done extending task lists 8218 1726776676.86135: done processing included files 8218 1726776676.86135: results queue empty 8218 1726776676.86136: checking for any_errors_fatal 8218 1726776676.86139: done checking for any_errors_fatal 8218 1726776676.86139: checking for max_fail_percentage 8218 1726776676.86140: done checking for max_fail_percentage 8218 1726776676.86141: checking to see if all hosts have failed and the running result is not ok 8218 1726776676.86141: done checking to see if all hosts have failed 8218 1726776676.86141: getting the remaining hosts for this loop 8218 1726776676.86142: done getting the remaining hosts for this loop 8218 1726776676.86144: getting the next task for host managed_node2 8218 1726776676.86146: done getting next task for host managed_node2 8218 1726776676.86149: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8218 1726776676.86151: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776676.86159: getting variables 8218 1726776676.86160: in VariableManager get_vars() 8218 1726776676.86169: Calling all_inventory to load vars for managed_node2 8218 1726776676.86171: Calling groups_inventory to load vars for managed_node2 8218 1726776676.86172: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776676.86176: Calling all_plugins_play to load vars for managed_node2 8218 1726776676.86177: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776676.86178: Calling groups_plugins_play to load vars for managed_node2 8218 1726776676.86260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776676.86371: done with get_vars() 8218 1726776676.86379: done getting variables 8218 1726776676.86404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 16:11:16 -0400 (0:00:00.029) 0:01:02.694 **** 8218 1726776676.86427: entering _queue_task() for managed_node2/command 8218 1726776676.86615: worker is 1 (out of 1 available) 8218 1726776676.86632: exiting _queue_task() for managed_node2/command 8218 1726776676.86643: done queuing things up, now waiting for results queue to drain 8218 1726776676.86645: waiting for pending results... 
10706 1726776676.86774: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 10706 1726776676.86897: in run() - task 120fa90a-8a95-cec2-986e-000000000483 10706 1726776676.86913: variable 'ansible_search_path' from source: unknown 10706 1726776676.86917: variable 'ansible_search_path' from source: unknown 10706 1726776676.86947: calling self._execute() 10706 1726776676.87015: variable 'ansible_host' from source: host vars for 'managed_node2' 10706 1726776676.87024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10706 1726776676.87034: variable 'omit' from source: magic vars 10706 1726776676.87113: variable 'omit' from source: magic vars 10706 1726776676.87161: variable 'omit' from source: magic vars 10706 1726776676.87183: variable 'omit' from source: magic vars 10706 1726776676.87216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10706 1726776676.87244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10706 1726776676.87265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10706 1726776676.87279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10706 1726776676.87289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10706 1726776676.87312: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10706 1726776676.87316: variable 'ansible_host' from source: host vars for 'managed_node2' 10706 1726776676.87319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10706 1726776676.87386: Set connection var ansible_connection to ssh 10706 1726776676.87392: Set connection var ansible_pipelining to False 10706 
1726776676.87396: Set connection var ansible_timeout to 10 10706 1726776676.87401: Set connection var ansible_module_compression to ZIP_DEFLATED 10706 1726776676.87404: Set connection var ansible_shell_type to sh 10706 1726776676.87409: Set connection var ansible_shell_executable to /bin/sh 10706 1726776676.87423: variable 'ansible_shell_executable' from source: unknown 10706 1726776676.87426: variable 'ansible_connection' from source: unknown 10706 1726776676.87430: variable 'ansible_module_compression' from source: unknown 10706 1726776676.87432: variable 'ansible_shell_type' from source: unknown 10706 1726776676.87434: variable 'ansible_shell_executable' from source: unknown 10706 1726776676.87435: variable 'ansible_host' from source: host vars for 'managed_node2' 10706 1726776676.87437: variable 'ansible_pipelining' from source: unknown 10706 1726776676.87439: variable 'ansible_timeout' from source: unknown 10706 1726776676.87441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10706 1726776676.87536: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10706 1726776676.87549: variable 'omit' from source: magic vars 10706 1726776676.87558: starting attempt loop 10706 1726776676.87561: running the handler 10706 1726776676.87577: _low_level_execute_command(): starting 10706 1726776676.87585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10706 1726776676.90033: stdout chunk (state=2): >>>/root <<< 10706 1726776676.90148: stderr chunk (state=3): >>><<< 10706 1726776676.90156: stdout chunk (state=3): >>><<< 10706 1726776676.90173: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10706 1726776676.90186: _low_level_execute_command(): starting 10706 
1726776676.90193: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872 `" && echo ansible-tmp-1726776676.9018183-10706-186746351604872="` echo /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872 `" ) && sleep 0' 10706 1726776676.93135: stdout chunk (state=2): >>>ansible-tmp-1726776676.9018183-10706-186746351604872=/root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872 <<< 10706 1726776676.93148: stderr chunk (state=2): >>><<< 10706 1726776676.93160: stdout chunk (state=3): >>><<< 10706 1726776676.93174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776676.9018183-10706-186746351604872=/root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872 , stderr= 10706 1726776676.93203: variable 'ansible_module_compression' from source: unknown 10706 1726776676.93263: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10706 1726776676.93300: variable 'ansible_facts' from source: unknown 10706 1726776676.93406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/AnsiballZ_command.py 10706 1726776676.93761: Sending initial data 10706 1726776676.93768: Sent initial data (155 bytes) 10706 1726776676.97045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpmpjhtovs /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/AnsiballZ_command.py <<< 10706 1726776676.98431: stderr chunk (state=3): >>><<< 10706 1726776676.98439: stdout chunk (state=3): >>><<< 10706 1726776676.98460: done transferring module to remote 10706 1726776676.98471: _low_level_execute_command(): starting 10706 1726776676.98476: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/ /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/AnsiballZ_command.py && sleep 0' 10706 1726776677.00860: stderr chunk (state=2): >>><<< 10706 1726776677.00868: stdout chunk (state=2): >>><<< 10706 1726776677.00882: _low_level_execute_command() done: rc=0, stdout=, stderr= 10706 1726776677.00886: _low_level_execute_command(): starting 10706 1726776677.00891: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/AnsiballZ_command.py && sleep 0' 10706 1726776677.27844: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:17.160927", "end": "2024-09-19 16:11:17.273603", "delta": "0:00:00.112676", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10706 1726776677.28730: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10706 1726776677.28776: stderr chunk (state=3): >>><<< 10706 1726776677.28783: stdout chunk (state=3): >>><<< 10706 1726776677.28798: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:17.160927", "end": "2024-09-19 16:11:17.273603", "delta": "0:00:00.112676", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10706 1726776677.28843: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10706 1726776677.28854: _low_level_execute_command(): starting 10706 1726776677.28862: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776676.9018183-10706-186746351604872/ > /dev/null 2>&1 && sleep 0' 10706 1726776677.31265: stderr chunk (state=2): >>><<< 10706 1726776677.31272: stdout chunk (state=2): >>><<< 10706 1726776677.31286: _low_level_execute_command() 
done: rc=0, stdout=, stderr= 10706 1726776677.31293: handler run complete 10706 1726776677.31313: Evaluated conditional (False): False 10706 1726776677.31322: attempt loop complete, returning result 10706 1726776677.31326: _execute() done 10706 1726776677.31330: dumping result to json 10706 1726776677.31336: done dumping result, returning 10706 1726776677.31344: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-cec2-986e-000000000483] 10706 1726776677.31350: sending task result for task 120fa90a-8a95-cec2-986e-000000000483 10706 1726776677.31382: done sending task result for task 120fa90a-8a95-cec2-986e-000000000483 10706 1726776677.31386: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.112676", "end": "2024-09-19 16:11:17.273603", "rc": 0, "start": "2024-09-19 16:11:17.160927" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
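The result blob above is the raw JSON that the remote `AnsiballZ_command.py` prints on stdout; the controller parses it back into a dict before post-processing (note the module itself reported `"changed": true`, while the final task status is `ok` with `"changed": false`, presumably because the role applies something like `changed_when: false` to the verify command). A minimal parse of the payload from this run, abridged to the fields used here:

```python
import json

# The JSON emitted by the remote AnsiballZ_command.py run, as captured in
# the stdout chunk above (trimmed to a few fields for illustration).
payload = (
    '{"changed": true, "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], '
    '"stdout": "Verification succeeded, current system settings match the '
    'preset profile.\\nSee TuneD log file (\'/var/log/tuned/tuned.log\') '
    'for details.", "stderr": ""}'
)
result = json.loads(payload)
assert result["rc"] == 0
# First line of the command's stdout, as shown in the STDOUT section above.
print(result["stdout"].splitlines()[0])
```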
8218 1726776677.31541: no more pending results, returning what we have 8218 1726776677.31545: results queue empty 8218 1726776677.31545: checking for any_errors_fatal 8218 1726776677.31547: done checking for any_errors_fatal 8218 1726776677.31548: checking for max_fail_percentage 8218 1726776677.31549: done checking for max_fail_percentage 8218 1726776677.31550: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.31551: done checking to see if all hosts have failed 8218 1726776677.31551: getting the remaining hosts for this loop 8218 1726776677.31554: done getting the remaining hosts for this loop 8218 1726776677.31558: getting the next task for host managed_node2 8218 1726776677.31564: done getting next task for host managed_node2 8218 1726776677.31566: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8218 1726776677.31569: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.31580: getting variables 8218 1726776677.31581: in VariableManager get_vars() 8218 1726776677.31615: Calling all_inventory to load vars for managed_node2 8218 1726776677.31618: Calling groups_inventory to load vars for managed_node2 8218 1726776677.31620: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.31627: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.31631: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.31633: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.31745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.31895: done with get_vars() 8218 1726776677.31903: done getting variables 8218 1726776677.31950: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.455) 0:01:03.150 **** 8218 1726776677.31976: entering _queue_task() for managed_node2/shell 8218 1726776677.32131: worker is 1 (out of 1 available) 8218 1726776677.32145: exiting _queue_task() for managed_node2/shell 8218 1726776677.32158: done queuing things up, now waiting for results queue to drain 8218 1726776677.32160: waiting for pending results... 
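The `Evaluated conditional (__kernel_settings_register_verify_values is failed): False` records that follow come from the Jinja `failed` test loaded from `ansible/plugins/test/core.py`. To a first approximation, that test just reads the registered result's `failed` key, which is why the successful `tuned-adm verify -i` run causes both error-reporting tasks to be skipped. A hedged sketch of that behavior (the real test also accepts any mapping type and raises an Ansible-specific error, not a plain `TypeError`):

```python
# Rough approximation of Ansible's `failed` Jinja test: a registered
# result counts as failed only if its dict carries a truthy 'failed' key,
# so a clean rc=0 command result evaluates to False and the conditional
# tasks are skipped.
def is_failed(result):
    if not isinstance(result, dict):
        raise TypeError("the 'failed' test expects a registered result dict")
    return bool(result.get("failed", False))

verify_result = {"changed": False, "rc": 0,
                 "stdout": "Verification succeeded..."}
print(is_failed(verify_result))  # False -> "skipping this task" in the log
```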
10725 1726776677.32282: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 10725 1726776677.32410: in run() - task 120fa90a-8a95-cec2-986e-000000000484 10725 1726776677.32438: variable 'ansible_search_path' from source: unknown 10725 1726776677.32443: variable 'ansible_search_path' from source: unknown 10725 1726776677.32471: calling self._execute() 10725 1726776677.32533: variable 'ansible_host' from source: host vars for 'managed_node2' 10725 1726776677.32541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10725 1726776677.32550: variable 'omit' from source: magic vars 10725 1726776677.32910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10725 1726776677.33151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10725 1726776677.33194: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10725 1726776677.33227: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10725 1726776677.33263: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10725 1726776677.33364: variable '__kernel_settings_register_verify_values' from source: set_fact 10725 1726776677.33391: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10725 1726776677.33396: when evaluation is False, skipping this task 10725 1726776677.33399: _execute() done 10725 1726776677.33402: dumping result to json 10725 1726776677.33405: done dumping result, returning 10725 1726776677.33411: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-cec2-986e-000000000484] 10725 1726776677.33418: sending task result for task 
120fa90a-8a95-cec2-986e-000000000484 10725 1726776677.33448: done sending task result for task 120fa90a-8a95-cec2-986e-000000000484 10725 1726776677.33452: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8218 1726776677.33669: no more pending results, returning what we have 8218 1726776677.33672: results queue empty 8218 1726776677.33673: checking for any_errors_fatal 8218 1726776677.33682: done checking for any_errors_fatal 8218 1726776677.33683: checking for max_fail_percentage 8218 1726776677.33684: done checking for max_fail_percentage 8218 1726776677.33685: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.33686: done checking to see if all hosts have failed 8218 1726776677.33686: getting the remaining hosts for this loop 8218 1726776677.33688: done getting the remaining hosts for this loop 8218 1726776677.33691: getting the next task for host managed_node2 8218 1726776677.33697: done getting next task for host managed_node2 8218 1726776677.33700: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8218 1726776677.33703: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776677.33718: getting variables 8218 1726776677.33720: in VariableManager get_vars() 8218 1726776677.33757: Calling all_inventory to load vars for managed_node2 8218 1726776677.33760: Calling groups_inventory to load vars for managed_node2 8218 1726776677.33762: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.33771: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.33774: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.33777: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.33943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.34142: done with get_vars() 8218 1726776677.34150: done getting variables 8218 1726776677.34193: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.022) 0:01:03.172 **** 8218 1726776677.34222: entering _queue_task() for managed_node2/fail 8218 1726776677.34379: worker is 1 (out of 1 available) 8218 1726776677.34392: exiting _queue_task() for managed_node2/fail 8218 1726776677.34402: done queuing things up, now waiting for results queue to drain 8218 1726776677.34405: waiting for pending results... 
10728 1726776677.34522: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 10728 1726776677.34635: in run() - task 120fa90a-8a95-cec2-986e-000000000485 10728 1726776677.34650: variable 'ansible_search_path' from source: unknown 10728 1726776677.34654: variable 'ansible_search_path' from source: unknown 10728 1726776677.34678: calling self._execute() 10728 1726776677.34736: variable 'ansible_host' from source: host vars for 'managed_node2' 10728 1726776677.34743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10728 1726776677.34748: variable 'omit' from source: magic vars 10728 1726776677.35057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10728 1726776677.35279: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10728 1726776677.35317: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10728 1726776677.35344: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10728 1726776677.35373: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10728 1726776677.35465: variable '__kernel_settings_register_verify_values' from source: set_fact 10728 1726776677.35487: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10728 1726776677.35491: when evaluation is False, skipping this task 10728 1726776677.35495: _execute() done 10728 1726776677.35498: dumping result to json 10728 1726776677.35501: done dumping result, returning 10728 1726776677.35507: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-cec2-986e-000000000485] 10728 1726776677.35514: sending task result for task 
120fa90a-8a95-cec2-986e-000000000485 10728 1726776677.35539: done sending task result for task 120fa90a-8a95-cec2-986e-000000000485 10728 1726776677.35543: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8218 1726776677.35659: no more pending results, returning what we have 8218 1726776677.35662: results queue empty 8218 1726776677.35663: checking for any_errors_fatal 8218 1726776677.35668: done checking for any_errors_fatal 8218 1726776677.35668: checking for max_fail_percentage 8218 1726776677.35669: done checking for max_fail_percentage 8218 1726776677.35670: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.35671: done checking to see if all hosts have failed 8218 1726776677.35671: getting the remaining hosts for this loop 8218 1726776677.35672: done getting the remaining hosts for this loop 8218 1726776677.35676: getting the next task for host managed_node2 8218 1726776677.35682: done getting next task for host managed_node2 8218 1726776677.35684: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8218 1726776677.35687: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.35701: getting variables 8218 1726776677.35702: in VariableManager get_vars() 8218 1726776677.35731: Calling all_inventory to load vars for managed_node2 8218 1726776677.35734: Calling groups_inventory to load vars for managed_node2 8218 1726776677.35735: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.35741: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.35743: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.35744: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.35848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.36042: done with get_vars() 8218 1726776677.36051: done getting variables 8218 1726776677.36104: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.019) 0:01:03.191 **** 8218 1726776677.36131: entering _queue_task() for managed_node2/set_fact 8218 1726776677.36311: worker is 1 (out of 1 available) 8218 1726776677.36326: exiting _queue_task() for managed_node2/set_fact 8218 1726776677.36339: done queuing things up, now waiting for results queue to drain 8218 1726776677.36341: waiting for pending results... 
10729 1726776677.36573: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 10729 1726776677.36704: in run() - task 120fa90a-8a95-cec2-986e-000000000315 10729 1726776677.36722: variable 'ansible_search_path' from source: unknown 10729 1726776677.36726: variable 'ansible_search_path' from source: unknown 10729 1726776677.36759: calling self._execute() 10729 1726776677.36840: variable 'ansible_host' from source: host vars for 'managed_node2' 10729 1726776677.36849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10729 1726776677.36859: variable 'omit' from source: magic vars 10729 1726776677.36958: variable 'omit' from source: magic vars 10729 1726776677.36999: variable 'omit' from source: magic vars 10729 1726776677.37024: variable 'omit' from source: magic vars 10729 1726776677.37064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10729 1726776677.37094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10729 1726776677.37111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10729 1726776677.37133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10729 1726776677.37145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10729 1726776677.37173: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10729 1726776677.37179: variable 'ansible_host' from source: host vars for 'managed_node2' 10729 1726776677.37183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10729 1726776677.37268: Set connection var ansible_connection to ssh 10729 1726776677.37275: Set connection var ansible_pipelining to 
False 10729 1726776677.37279: Set connection var ansible_timeout to 10 10729 1726776677.37284: Set connection var ansible_module_compression to ZIP_DEFLATED 10729 1726776677.37287: Set connection var ansible_shell_type to sh 10729 1726776677.37290: Set connection var ansible_shell_executable to /bin/sh 10729 1726776677.37303: variable 'ansible_shell_executable' from source: unknown 10729 1726776677.37307: variable 'ansible_connection' from source: unknown 10729 1726776677.37309: variable 'ansible_module_compression' from source: unknown 10729 1726776677.37311: variable 'ansible_shell_type' from source: unknown 10729 1726776677.37313: variable 'ansible_shell_executable' from source: unknown 10729 1726776677.37314: variable 'ansible_host' from source: host vars for 'managed_node2' 10729 1726776677.37316: variable 'ansible_pipelining' from source: unknown 10729 1726776677.37318: variable 'ansible_timeout' from source: unknown 10729 1726776677.37320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10729 1726776677.37535: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10729 1726776677.37549: variable 'omit' from source: magic vars 10729 1726776677.37558: starting attempt loop 10729 1726776677.37562: running the handler 10729 1726776677.37572: handler run complete 10729 1726776677.37583: attempt loop complete, returning result 10729 1726776677.37587: _execute() done 10729 1726776677.37590: dumping result to json 10729 1726776677.37593: done dumping result, returning 10729 1726776677.37599: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000315] 10729 
1726776677.37606: sending task result for task 120fa90a-8a95-cec2-986e-000000000315 10729 1726776677.37633: done sending task result for task 120fa90a-8a95-cec2-986e-000000000315 10729 1726776677.37637: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8218 1726776677.37989: no more pending results, returning what we have 8218 1726776677.37992: results queue empty 8218 1726776677.37993: checking for any_errors_fatal 8218 1726776677.37999: done checking for any_errors_fatal 8218 1726776677.38000: checking for max_fail_percentage 8218 1726776677.38002: done checking for max_fail_percentage 8218 1726776677.38002: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.38003: done checking to see if all hosts have failed 8218 1726776677.38004: getting the remaining hosts for this loop 8218 1726776677.38005: done getting the remaining hosts for this loop 8218 1726776677.38008: getting the next task for host managed_node2 8218 1726776677.38014: done getting next task for host managed_node2 8218 1726776677.38016: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8218 1726776677.38018: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.38030: getting variables 8218 1726776677.38032: in VariableManager get_vars() 8218 1726776677.38062: Calling all_inventory to load vars for managed_node2 8218 1726776677.38065: Calling groups_inventory to load vars for managed_node2 8218 1726776677.38066: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.38072: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.38074: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.38075: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.38182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.38297: done with get_vars() 8218 1726776677.38304: done getting variables 8218 1726776677.38344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.022) 0:01:03.214 **** 8218 1726776677.38367: entering _queue_task() for managed_node2/set_fact 8218 1726776677.38506: worker is 1 (out of 1 available) 8218 1726776677.38519: exiting _queue_task() for managed_node2/set_fact 8218 1726776677.38532: done queuing things up, now waiting for results queue to drain 8218 1726776677.38534: waiting for pending results... 
10731 1726776677.38644: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 10731 1726776677.38741: in run() - task 120fa90a-8a95-cec2-986e-000000000316 10731 1726776677.38755: variable 'ansible_search_path' from source: unknown 10731 1726776677.38760: variable 'ansible_search_path' from source: unknown 10731 1726776677.38783: calling self._execute() 10731 1726776677.38840: variable 'ansible_host' from source: host vars for 'managed_node2' 10731 1726776677.38847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10731 1726776677.38852: variable 'omit' from source: magic vars 10731 1726776677.38919: variable 'omit' from source: magic vars 10731 1726776677.38952: variable 'omit' from source: magic vars 10731 1726776677.39205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10731 1726776677.39419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10731 1726776677.39453: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10731 1726776677.39477: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10731 1726776677.39504: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10731 1726776677.39598: variable '__kernel_settings_register_profile' from source: set_fact 10731 1726776677.39611: variable '__kernel_settings_register_mode' from source: set_fact 10731 1726776677.39619: variable '__kernel_settings_register_apply' from source: set_fact 10731 1726776677.39658: variable 'omit' from source: magic vars 10731 1726776677.39679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10731 1726776677.39699: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10731 1726776677.39714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10731 1726776677.39727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10731 1726776677.39738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10731 1726776677.39761: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10731 1726776677.39766: variable 'ansible_host' from source: host vars for 'managed_node2' 10731 1726776677.39771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10731 1726776677.39850: Set connection var ansible_connection to ssh 10731 1726776677.39858: Set connection var ansible_pipelining to False 10731 1726776677.39863: Set connection var ansible_timeout to 10 10731 1726776677.39867: Set connection var ansible_module_compression to ZIP_DEFLATED 10731 1726776677.39871: Set connection var ansible_shell_type to sh 10731 1726776677.39873: Set connection var ansible_shell_executable to /bin/sh 10731 1726776677.39885: variable 'ansible_shell_executable' from source: unknown 10731 1726776677.39888: variable 'ansible_connection' from source: unknown 10731 1726776677.39890: variable 'ansible_module_compression' from source: unknown 10731 1726776677.39891: variable 'ansible_shell_type' from source: unknown 10731 1726776677.39893: variable 'ansible_shell_executable' from source: unknown 10731 1726776677.39894: variable 'ansible_host' from source: host vars for 'managed_node2' 10731 1726776677.39896: variable 'ansible_pipelining' from source: unknown 10731 1726776677.39898: variable 'ansible_timeout' from source: unknown 10731 1726776677.39900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10731 
1726776677.39966: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10731 1726776677.39976: variable 'omit' from source: magic vars 10731 1726776677.39980: starting attempt loop 10731 1726776677.39982: running the handler 10731 1726776677.39989: handler run complete 10731 1726776677.39995: attempt loop complete, returning result 10731 1726776677.39996: _execute() done 10731 1726776677.39998: dumping result to json 10731 1726776677.40000: done dumping result, returning 10731 1726776677.40004: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-cec2-986e-000000000316] 10731 1726776677.40008: sending task result for task 120fa90a-8a95-cec2-986e-000000000316 10731 1726776677.40023: done sending task result for task 120fa90a-8a95-cec2-986e-000000000316 10731 1726776677.40025: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8218 1726776677.40310: no more pending results, returning what we have 8218 1726776677.40313: results queue empty 8218 1726776677.40314: checking for any_errors_fatal 8218 1726776677.40318: done checking for any_errors_fatal 8218 1726776677.40319: checking for max_fail_percentage 8218 1726776677.40320: done checking for max_fail_percentage 8218 1726776677.40321: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.40322: done checking to see if all hosts have failed 8218 1726776677.40322: getting the remaining hosts for this loop 8218 1726776677.40324: done getting the remaining hosts for this loop 8218 1726776677.40327: getting the next task for host managed_node2 8218 1726776677.40337: 
done getting next task for host managed_node2 8218 1726776677.40339: ^ task is: TASK: meta (role_complete) 8218 1726776677.40342: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776677.40353: getting variables 8218 1726776677.40354: in VariableManager get_vars() 8218 1726776677.40388: Calling all_inventory to load vars for managed_node2 8218 1726776677.40392: Calling groups_inventory to load vars for managed_node2 8218 1726776677.40394: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.40402: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.40404: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.40407: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.40566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.40809: done with get_vars() 8218 1726776677.40819: done getting variables 8218 1726776677.40878: done queuing things up, now waiting for results queue to drain 8218 1726776677.40879: results queue empty 8218 1726776677.40880: checking for any_errors_fatal 8218 1726776677.40882: done checking for any_errors_fatal 8218 1726776677.40883: checking for max_fail_percentage 8218 1726776677.40883: done checking for max_fail_percentage 8218 1726776677.40886: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.40887: done 
checking to see if all hosts have failed 8218 1726776677.40887: getting the remaining hosts for this loop 8218 1726776677.40888: done getting the remaining hosts for this loop 8218 1726776677.40889: getting the next task for host managed_node2 8218 1726776677.40891: done getting next task for host managed_node2 8218 1726776677.40892: ^ task is: TASK: meta (flush_handlers) 8218 1726776677.40892: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776677.40895: getting variables 8218 1726776677.40896: in VariableManager get_vars() 8218 1726776677.40902: Calling all_inventory to load vars for managed_node2 8218 1726776677.40903: Calling groups_inventory to load vars for managed_node2 8218 1726776677.40904: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.40907: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.40908: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.40910: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.40985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.41099: done with get_vars() 8218 1726776677.41104: done getting variables TASK [Force handlers] ********************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:130 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.027) 0:01:03.242 **** 8218 1726776677.41148: in VariableManager get_vars() 8218 1726776677.41155: Calling all_inventory to load vars for managed_node2 8218 1726776677.41157: Calling groups_inventory to load vars for managed_node2 8218 
1726776677.41159: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.41162: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.41163: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.41164: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.41244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.41343: done with get_vars() META: triggered running handlers for managed_node2 8218 1726776677.41353: done queuing things up, now waiting for results queue to drain 8218 1726776677.41354: results queue empty 8218 1726776677.41355: checking for any_errors_fatal 8218 1726776677.41356: done checking for any_errors_fatal 8218 1726776677.41356: checking for max_fail_percentage 8218 1726776677.41357: done checking for max_fail_percentage 8218 1726776677.41357: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.41357: done checking to see if all hosts have failed 8218 1726776677.41358: getting the remaining hosts for this loop 8218 1726776677.41358: done getting the remaining hosts for this loop 8218 1726776677.41360: getting the next task for host managed_node2 8218 1726776677.41362: done getting next task for host managed_node2 8218 1726776677.41363: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8218 1726776677.41363: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.41365: getting variables 8218 1726776677.41365: in VariableManager get_vars() 8218 1726776677.41372: Calling all_inventory to load vars for managed_node2 8218 1726776677.41374: Calling groups_inventory to load vars for managed_node2 8218 1726776677.41375: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.41378: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.41379: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.41381: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.41476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.41575: done with get_vars() 8218 1726776677.41581: done getting variables 8218 1726776677.41605: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:133 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.004) 0:01:03.246 **** 8218 1726776677.41617: entering _queue_task() for managed_node2/assert 8218 1726776677.41759: worker is 1 (out of 1 available) 8218 1726776677.41772: exiting _queue_task() for managed_node2/assert 8218 1726776677.41783: done queuing things up, now waiting for results queue to drain 8218 1726776677.41785: waiting for pending results... 
10733 1726776677.41903: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 10733 1726776677.41992: in run() - task 120fa90a-8a95-cec2-986e-00000000001c 10733 1726776677.42009: variable 'ansible_search_path' from source: unknown 10733 1726776677.42037: calling self._execute() 10733 1726776677.42102: variable 'ansible_host' from source: host vars for 'managed_node2' 10733 1726776677.42110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10733 1726776677.42119: variable 'omit' from source: magic vars 10733 1726776677.42194: variable 'omit' from source: magic vars 10733 1726776677.42219: variable 'omit' from source: magic vars 10733 1726776677.42245: variable 'omit' from source: magic vars 10733 1726776677.42278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10733 1726776677.42302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10733 1726776677.42320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10733 1726776677.42335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10733 1726776677.42348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10733 1726776677.42372: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10733 1726776677.42378: variable 'ansible_host' from source: host vars for 'managed_node2' 10733 1726776677.42382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10733 1726776677.42446: Set connection var ansible_connection to ssh 10733 1726776677.42457: Set connection var ansible_pipelining to False 10733 1726776677.42465: Set connection var ansible_timeout to 10 10733 1726776677.42472: Set connection 
var ansible_module_compression to ZIP_DEFLATED 10733 1726776677.42477: Set connection var ansible_shell_type to sh 10733 1726776677.42483: Set connection var ansible_shell_executable to /bin/sh 10733 1726776677.42497: variable 'ansible_shell_executable' from source: unknown 10733 1726776677.42501: variable 'ansible_connection' from source: unknown 10733 1726776677.42505: variable 'ansible_module_compression' from source: unknown 10733 1726776677.42508: variable 'ansible_shell_type' from source: unknown 10733 1726776677.42511: variable 'ansible_shell_executable' from source: unknown 10733 1726776677.42515: variable 'ansible_host' from source: host vars for 'managed_node2' 10733 1726776677.42519: variable 'ansible_pipelining' from source: unknown 10733 1726776677.42522: variable 'ansible_timeout' from source: unknown 10733 1726776677.42526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10733 1726776677.42616: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10733 1726776677.42627: variable 'omit' from source: magic vars 10733 1726776677.42635: starting attempt loop 10733 1726776677.42639: running the handler 10733 1726776677.42884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10733 1726776677.44616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10733 1726776677.44663: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10733 1726776677.44689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10733 1726776677.44711: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10733 1726776677.44733: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10733 1726776677.44780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10733 1726776677.44797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10733 1726776677.44812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10733 1726776677.44839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10733 1726776677.44849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10733 1726776677.44926: variable 'kernel_settings_reboot_required' from source: set_fact 10733 1726776677.44943: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 10733 1726776677.44950: handler run complete 10733 1726776677.44965: attempt loop complete, returning result 10733 1726776677.44968: _execute() done 10733 1726776677.44970: dumping result to json 10733 1726776677.44972: done dumping result, returning 10733 1726776677.44976: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [120fa90a-8a95-cec2-986e-00000000001c] 10733 1726776677.44981: sending task result for 
task 120fa90a-8a95-cec2-986e-00000000001c 10733 1726776677.44999: done sending task result for task 120fa90a-8a95-cec2-986e-00000000001c 10733 1726776677.45002: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776677.45169: no more pending results, returning what we have 8218 1726776677.45173: results queue empty 8218 1726776677.45174: checking for any_errors_fatal 8218 1726776677.45175: done checking for any_errors_fatal 8218 1726776677.45176: checking for max_fail_percentage 8218 1726776677.45177: done checking for max_fail_percentage 8218 1726776677.45178: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.45179: done checking to see if all hosts have failed 8218 1726776677.45179: getting the remaining hosts for this loop 8218 1726776677.45180: done getting the remaining hosts for this loop 8218 1726776677.45183: getting the next task for host managed_node2 8218 1726776677.45188: done getting next task for host managed_node2 8218 1726776677.45190: ^ task is: TASK: Ensure role reported changed 8218 1726776677.45192: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.45195: getting variables 8218 1726776677.45196: in VariableManager get_vars() 8218 1726776677.45232: Calling all_inventory to load vars for managed_node2 8218 1726776677.45234: Calling groups_inventory to load vars for managed_node2 8218 1726776677.45238: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.45248: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.45257: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.45260: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.45371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.45485: done with get_vars() 8218 1726776677.45494: done getting variables 8218 1726776677.45536: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:137 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.039) 0:01:03.286 **** 8218 1726776677.45560: entering _queue_task() for managed_node2/assert 8218 1726776677.45718: worker is 1 (out of 1 available) 8218 1726776677.45733: exiting _queue_task() for managed_node2/assert 8218 1726776677.45744: done queuing things up, now waiting for results queue to drain 8218 1726776677.45745: waiting for pending results... 
10734 1726776677.45866: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 10734 1726776677.45960: in run() - task 120fa90a-8a95-cec2-986e-00000000001d 10734 1726776677.45974: variable 'ansible_search_path' from source: unknown 10734 1726776677.46000: calling self._execute() 10734 1726776677.46061: variable 'ansible_host' from source: host vars for 'managed_node2' 10734 1726776677.46337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10734 1726776677.46348: variable 'omit' from source: magic vars 10734 1726776677.46418: variable 'omit' from source: magic vars 10734 1726776677.46442: variable 'omit' from source: magic vars 10734 1726776677.46464: variable 'omit' from source: magic vars 10734 1726776677.46497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10734 1726776677.46522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10734 1726776677.46541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10734 1726776677.46556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10734 1726776677.46567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10734 1726776677.46589: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10734 1726776677.46595: variable 'ansible_host' from source: host vars for 'managed_node2' 10734 1726776677.46600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10734 1726776677.46668: Set connection var ansible_connection to ssh 10734 1726776677.46676: Set connection var ansible_pipelining to False 10734 1726776677.46683: Set connection var ansible_timeout to 10 10734 1726776677.46690: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10734 1726776677.46695: Set connection var ansible_shell_type to sh 10734 1726776677.46700: Set connection var ansible_shell_executable to /bin/sh 10734 1726776677.46716: variable 'ansible_shell_executable' from source: unknown 10734 1726776677.46720: variable 'ansible_connection' from source: unknown 10734 1726776677.46723: variable 'ansible_module_compression' from source: unknown 10734 1726776677.46726: variable 'ansible_shell_type' from source: unknown 10734 1726776677.46731: variable 'ansible_shell_executable' from source: unknown 10734 1726776677.46734: variable 'ansible_host' from source: host vars for 'managed_node2' 10734 1726776677.46739: variable 'ansible_pipelining' from source: unknown 10734 1726776677.46742: variable 'ansible_timeout' from source: unknown 10734 1726776677.46746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10734 1726776677.46832: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10734 1726776677.46844: variable 'omit' from source: magic vars 10734 1726776677.46850: starting attempt loop 10734 1726776677.46853: running the handler 10734 1726776677.47082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10734 1726776677.48546: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10734 1726776677.48597: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10734 1726776677.48624: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10734 1726776677.48652: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10734 1726776677.48673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10734 1726776677.48718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10734 1726776677.48740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10734 1726776677.48761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10734 1726776677.48787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10734 1726776677.48798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10734 1726776677.48872: variable '__kernel_settings_changed' from source: set_fact 10734 1726776677.48888: Evaluated conditional (__kernel_settings_changed | d(false)): True 10734 1726776677.48894: handler run complete 10734 1726776677.48909: attempt loop complete, returning result 10734 1726776677.48912: _execute() done 10734 1726776677.48914: dumping result to json 10734 1726776677.48916: done dumping result, returning 10734 1726776677.48920: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [120fa90a-8a95-cec2-986e-00000000001d] 10734 1726776677.48924: sending task result for task 120fa90a-8a95-cec2-986e-00000000001d 10734 
1726776677.48943: done sending task result for task 120fa90a-8a95-cec2-986e-00000000001d 10734 1726776677.48945: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776677.49183: no more pending results, returning what we have 8218 1726776677.49187: results queue empty 8218 1726776677.49188: checking for any_errors_fatal 8218 1726776677.49191: done checking for any_errors_fatal 8218 1726776677.49191: checking for max_fail_percentage 8218 1726776677.49192: done checking for max_fail_percentage 8218 1726776677.49193: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.49193: done checking to see if all hosts have failed 8218 1726776677.49194: getting the remaining hosts for this loop 8218 1726776677.49195: done getting the remaining hosts for this loop 8218 1726776677.49197: getting the next task for host managed_node2 8218 1726776677.49201: done getting next task for host managed_node2 8218 1726776677.49202: ^ task is: TASK: Check sysctl after reboot 8218 1726776677.49203: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.49206: getting variables 8218 1726776677.49207: in VariableManager get_vars() 8218 1726776677.49442: Calling all_inventory to load vars for managed_node2 8218 1726776677.49444: Calling groups_inventory to load vars for managed_node2 8218 1726776677.49446: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.49455: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.49456: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.49463: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.49558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.49664: done with get_vars() 8218 1726776677.49671: done getting variables 8218 1726776677.49708: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:141 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.041) 0:01:03.327 **** 8218 1726776677.49726: entering _queue_task() for managed_node2/shell 8218 1726776677.49877: worker is 1 (out of 1 available) 8218 1726776677.49890: exiting _queue_task() for managed_node2/shell 8218 1726776677.49900: done queuing things up, now waiting for results queue to drain 8218 1726776677.49902: waiting for pending results... 
10735 1726776677.50014: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10735 1726776677.50117: in run() - task 120fa90a-8a95-cec2-986e-00000000001e 10735 1726776677.50135: variable 'ansible_search_path' from source: unknown 10735 1726776677.50163: calling self._execute() 10735 1726776677.50227: variable 'ansible_host' from source: host vars for 'managed_node2' 10735 1726776677.50238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10735 1726776677.50246: variable 'omit' from source: magic vars 10735 1726776677.50315: variable 'omit' from source: magic vars 10735 1726776677.50344: variable 'omit' from source: magic vars 10735 1726776677.50367: variable 'omit' from source: magic vars 10735 1726776677.50399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10735 1726776677.50424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10735 1726776677.50443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10735 1726776677.50458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10735 1726776677.50470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10735 1726776677.50491: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10735 1726776677.50495: variable 'ansible_host' from source: host vars for 'managed_node2' 10735 1726776677.50498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10735 1726776677.50566: Set connection var ansible_connection to ssh 10735 1726776677.50574: Set connection var ansible_pipelining to False 10735 1726776677.50580: Set connection var ansible_timeout to 10 10735 1726776677.50587: Set connection var ansible_module_compression to 
ZIP_DEFLATED 10735 1726776677.50590: Set connection var ansible_shell_type to sh 10735 1726776677.50593: Set connection var ansible_shell_executable to /bin/sh 10735 1726776677.50606: variable 'ansible_shell_executable' from source: unknown 10735 1726776677.50610: variable 'ansible_connection' from source: unknown 10735 1726776677.50612: variable 'ansible_module_compression' from source: unknown 10735 1726776677.50614: variable 'ansible_shell_type' from source: unknown 10735 1726776677.50615: variable 'ansible_shell_executable' from source: unknown 10735 1726776677.50617: variable 'ansible_host' from source: host vars for 'managed_node2' 10735 1726776677.50619: variable 'ansible_pipelining' from source: unknown 10735 1726776677.50620: variable 'ansible_timeout' from source: unknown 10735 1726776677.50622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10735 1726776677.50709: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10735 1726776677.50718: variable 'omit' from source: magic vars 10735 1726776677.50722: starting attempt loop 10735 1726776677.50724: running the handler 10735 1726776677.50732: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10735 1726776677.50750: _low_level_execute_command(): starting 10735 1726776677.50758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10735 1726776677.53120: stdout chunk (state=2): >>>/root <<< 10735 1726776677.53235: stderr chunk (state=3): >>><<< 10735 
1726776677.53241: stdout chunk (state=3): >>><<< 10735 1726776677.53258: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10735 1726776677.53272: _low_level_execute_command(): starting 10735 1726776677.53277: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300 `" && echo ansible-tmp-1726776677.5326593-10735-74476736383300="` echo /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300 `" ) && sleep 0' 10735 1726776677.56034: stdout chunk (state=2): >>>ansible-tmp-1726776677.5326593-10735-74476736383300=/root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300 <<< 10735 1726776677.56165: stderr chunk (state=3): >>><<< 10735 1726776677.56171: stdout chunk (state=3): >>><<< 10735 1726776677.56183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776677.5326593-10735-74476736383300=/root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300 , stderr= 10735 1726776677.56203: variable 'ansible_module_compression' from source: unknown 10735 1726776677.56249: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10735 1726776677.56279: variable 'ansible_facts' from source: unknown 10735 1726776677.56350: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/AnsiballZ_command.py 10735 1726776677.56439: Sending initial data 10735 1726776677.56446: Sent initial data (154 bytes) 10735 1726776677.58946: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpifk3dwiy /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/AnsiballZ_command.py <<< 10735 1726776677.60962: stderr chunk (state=3): >>><<< 10735 1726776677.60973: stdout chunk (state=3): >>><<< 10735 1726776677.60996: done transferring module to 
remote 10735 1726776677.61011: _low_level_execute_command(): starting 10735 1726776677.61017: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/ /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/AnsiballZ_command.py && sleep 0' 10735 1726776677.63704: stderr chunk (state=2): >>><<< 10735 1726776677.63715: stdout chunk (state=2): >>><<< 10735 1726776677.63737: _low_level_execute_command() done: rc=0, stdout=, stderr= 10735 1726776677.63743: _low_level_execute_command(): starting 10735 1726776677.63749: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/AnsiballZ_command.py && sleep 0' 10735 1726776677.79950: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "start": "2024-09-19 16:11:17.791973", "end": "2024-09-19 16:11:17.798006", "delta": "0:00:00.006033", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10735 1726776677.81121: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10735 1726776677.81165: stderr chunk (state=3): >>><<< 10735 1726776677.81172: stdout chunk (state=3): >>><<< 10735 1726776677.81188: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "start": "2024-09-19 16:11:17.791973", "end": "2024-09-19 16:11:17.798006", "delta": "0:00:00.006033", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10735 1726776677.81232: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10735 1726776677.81242: _low_level_execute_command(): starting 10735 1726776677.81248: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776677.5326593-10735-74476736383300/ > /dev/null 2>&1 && sleep 0' 10735 1726776677.83608: stderr chunk (state=2): >>><<< 10735 1726776677.83616: stdout chunk (state=2): >>><<< 10735 1726776677.83631: _low_level_execute_command() done: 
rc=0, stdout=, stderr= 10735 1726776677.83638: handler run complete 10735 1726776677.83658: Evaluated conditional (False): False 10735 1726776677.83667: attempt loop complete, returning result 10735 1726776677.83672: _execute() done 10735 1726776677.83675: dumping result to json 10735 1726776677.83680: done dumping result, returning 10735 1726776677.83686: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [120fa90a-8a95-cec2-986e-00000000001e] 10735 1726776677.83693: sending task result for task 120fa90a-8a95-cec2-986e-00000000001e 10735 1726776677.83722: done sending task result for task 120fa90a-8a95-cec2-986e-00000000001e 10735 1726776677.83726: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "delta": "0:00:00.006033", "end": "2024-09-19 16:11:17.798006", "rc": 0, "start": "2024-09-19 16:11:17.791973" } 8218 1726776677.83949: no more pending results, returning what we have 8218 1726776677.83951: results queue empty 8218 1726776677.83951: checking for any_errors_fatal 8218 1726776677.83958: done checking for any_errors_fatal 8218 1726776677.83958: checking for max_fail_percentage 8218 1726776677.83959: done checking for max_fail_percentage 8218 1726776677.83960: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.83961: done checking to see if all hosts have failed 8218 1726776677.83961: getting the remaining hosts for this loop 8218 1726776677.83962: done getting the remaining hosts for this loop 8218 1726776677.83964: getting the next task for host managed_node2 8218 1726776677.83969: done getting next task for host managed_node2 8218 1726776677.83971: ^ task is: TASK: Apply kernel_settings for removing 8218 1726776677.83972: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776677.83975: getting variables 8218 1726776677.83976: in VariableManager get_vars() 8218 1726776677.84003: Calling all_inventory to load vars for managed_node2 8218 1726776677.84005: Calling groups_inventory to load vars for managed_node2 8218 1726776677.84006: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.84013: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.84015: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.84018: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.84131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.84274: done with get_vars() 8218 1726776677.84282: done getting variables TASK [Apply kernel_settings for removing] ************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:147 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.346) 0:01:03.674 **** 8218 1726776677.84349: entering _queue_task() for managed_node2/include_role 8218 1726776677.84500: worker is 1 (out of 1 available) 8218 1726776677.84514: exiting _queue_task() for managed_node2/include_role 8218 1726776677.84525: done queuing things up, now waiting for results queue to drain 8218 1726776677.84527: waiting for pending results... 
10755 1726776677.84649: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing 10755 1726776677.84759: in run() - task 120fa90a-8a95-cec2-986e-00000000001f 10755 1726776677.84775: variable 'ansible_search_path' from source: unknown 10755 1726776677.84803: calling self._execute() 10755 1726776677.84872: variable 'ansible_host' from source: host vars for 'managed_node2' 10755 1726776677.84881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10755 1726776677.84889: variable 'omit' from source: magic vars 10755 1726776677.84958: _execute() done 10755 1726776677.84962: dumping result to json 10755 1726776677.84966: done dumping result, returning 10755 1726776677.84970: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing [120fa90a-8a95-cec2-986e-00000000001f] 10755 1726776677.84975: sending task result for task 120fa90a-8a95-cec2-986e-00000000001f 10755 1726776677.85001: done sending task result for task 120fa90a-8a95-cec2-986e-00000000001f 10755 1726776677.85004: WORKER PROCESS EXITING 8218 1726776677.85197: no more pending results, returning what we have 8218 1726776677.85200: in VariableManager get_vars() 8218 1726776677.85228: Calling all_inventory to load vars for managed_node2 8218 1726776677.85232: Calling groups_inventory to load vars for managed_node2 8218 1726776677.85233: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.85240: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.85242: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.85244: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.85350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.85462: done with get_vars() 8218 1726776677.85467: variable 'ansible_search_path' from source: unknown 8218 1726776677.86945: variable 'omit' from source: 
magic vars 8218 1726776677.86962: variable 'omit' from source: magic vars 8218 1726776677.86970: variable 'omit' from source: magic vars 8218 1726776677.86973: we have included files to process 8218 1726776677.86973: generating all_blocks data 8218 1726776677.86974: done generating all_blocks data 8218 1726776677.86976: processing included file: fedora.linux_system_roles.kernel_settings 8218 1726776677.86989: in VariableManager get_vars() 8218 1726776677.86999: done with get_vars() 8218 1726776677.87018: in VariableManager get_vars() 8218 1726776677.87030: done with get_vars() 8218 1726776677.87059: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8218 1726776677.87096: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8218 1726776677.87111: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8218 1726776677.87159: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8218 1726776677.87480: in VariableManager get_vars() 8218 1726776677.87494: done with get_vars() 8218 1726776677.88297: in VariableManager get_vars() 8218 1726776677.88310: done with get_vars() 8218 1726776677.88411: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8218 1726776677.88784: iterating over new_blocks loaded from include file 8218 1726776677.88796: in VariableManager get_vars() 8218 1726776677.88806: done with get_vars() 8218 1726776677.88807: filtering new block on tags 8218 1726776677.88832: done filtering new block on tags 8218 1726776677.88834: in VariableManager get_vars() 8218 1726776677.88843: done with get_vars() 8218 1726776677.88844: filtering new block on tags 8218 1726776677.88866: done filtering new block on tags 8218 
1726776677.88867: in VariableManager get_vars() 8218 1726776677.88877: done with get_vars() 8218 1726776677.88878: filtering new block on tags 8218 1726776677.88956: done filtering new block on tags 8218 1726776677.88958: in VariableManager get_vars() 8218 1726776677.88968: done with get_vars() 8218 1726776677.88969: filtering new block on tags 8218 1726776677.88996: done filtering new block on tags 8218 1726776677.88998: done iterating over new_blocks loaded from include file 8218 1726776677.88998: extending task lists for all hosts with included blocks 8218 1726776677.90378: done extending task lists 8218 1726776677.90379: done processing included files 8218 1726776677.90379: results queue empty 8218 1726776677.90380: checking for any_errors_fatal 8218 1726776677.90382: done checking for any_errors_fatal 8218 1726776677.90383: checking for max_fail_percentage 8218 1726776677.90383: done checking for max_fail_percentage 8218 1726776677.90384: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.90384: done checking to see if all hosts have failed 8218 1726776677.90385: getting the remaining hosts for this loop 8218 1726776677.90385: done getting the remaining hosts for this loop 8218 1726776677.90387: getting the next task for host managed_node2 8218 1726776677.90389: done getting next task for host managed_node2 8218 1726776677.90391: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8218 1726776677.90393: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776677.90400: getting variables 8218 1726776677.90400: in VariableManager get_vars() 8218 1726776677.90409: Calling all_inventory to load vars for managed_node2 8218 1726776677.90411: Calling groups_inventory to load vars for managed_node2 8218 1726776677.90412: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.90415: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.90416: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.90418: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.90511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.90620: done with get_vars() 8218 1726776677.90627: done getting variables 8218 1726776677.90652: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.063) 0:01:03.737 **** 8218 1726776677.90672: entering _queue_task() for managed_node2/fail 8218 1726776677.90836: worker is 1 (out of 1 available) 8218 1726776677.90849: exiting _queue_task() for managed_node2/fail 8218 1726776677.90861: done queuing things up, now waiting for results queue to drain 8218 1726776677.90862: waiting for pending results... 
10756 1726776677.90988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10756 1726776677.91103: in run() - task 120fa90a-8a95-cec2-986e-00000000060d 10756 1726776677.91119: variable 'ansible_search_path' from source: unknown 10756 1726776677.91123: variable 'ansible_search_path' from source: unknown 10756 1726776677.91151: calling self._execute() 10756 1726776677.91219: variable 'ansible_host' from source: host vars for 'managed_node2' 10756 1726776677.91228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10756 1726776677.91238: variable 'omit' from source: magic vars 10756 1726776677.91576: variable 'kernel_settings_sysctl' from source: include params 10756 1726776677.91590: variable '__kernel_settings_state_empty' from source: role '' all vars 10756 1726776677.91599: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 10756 1726776677.91793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10756 1726776677.93507: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10756 1726776677.93555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10756 1726776677.93587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10756 1726776677.93613: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10756 1726776677.93635: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10756 1726776677.93688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
10756 1726776677.93711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10756 1726776677.93732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10756 1726776677.93761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10756 1726776677.93773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10756 1726776677.93812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10756 1726776677.93832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10756 1726776677.93849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10756 1726776677.93877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10756 1726776677.93887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 10756 1726776677.93917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10756 1726776677.93936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10756 1726776677.93955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10756 1726776677.93980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10756 1726776677.93991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10756 1726776677.94169: variable 'kernel_settings_sysctl' from source: include params 10756 1726776677.94219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10756 1726776677.94326: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10756 1726776677.94359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10756 1726776677.94382: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10756 1726776677.94418: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10756 1726776677.94450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10756 1726776677.94470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10756 1726776677.94487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10756 1726776677.94505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10756 1726776677.94533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10756 1726776677.94549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10756 1726776677.94569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10756 1726776677.94586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10756 1726776677.94606: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 10756 1726776677.94611: when evaluation is False, skipping this task 10756 
1726776677.94615: _execute() done 10756 1726776677.94618: dumping result to json 10756 1726776677.94621: done dumping result, returning 10756 1726776677.94627: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-cec2-986e-00000000060d] 10756 1726776677.94636: sending task result for task 120fa90a-8a95-cec2-986e-00000000060d 10756 1726776677.94660: done sending task result for task 120fa90a-8a95-cec2-986e-00000000060d 10756 1726776677.94663: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8218 1726776677.94776: no more pending results, returning what we have 8218 1726776677.94780: results queue empty 8218 1726776677.94780: checking for any_errors_fatal 8218 1726776677.94782: done checking for any_errors_fatal 8218 1726776677.94782: checking for max_fail_percentage 8218 1726776677.94784: done checking for max_fail_percentage 8218 1726776677.94785: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.94786: done checking to see if all hosts have failed 8218 1726776677.94786: getting the remaining hosts for this loop 8218 1726776677.94787: done getting the remaining hosts for this loop 8218 1726776677.94790: getting the next task for host managed_node2 8218 1726776677.94796: done getting next task for host managed_node2 8218 1726776677.94799: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8218 1726776677.94801: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776677.94816: getting variables 8218 1726776677.94818: in VariableManager get_vars() 8218 1726776677.94853: Calling all_inventory to load vars for managed_node2 8218 1726776677.94856: Calling groups_inventory to load vars for managed_node2 8218 1726776677.94858: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.94866: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.94869: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.94871: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.94997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.95122: done with get_vars() 8218 1726776677.95133: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.045) 0:01:03.782 **** 8218 1726776677.95200: entering _queue_task() for managed_node2/include_tasks 8218 1726776677.95363: worker is 1 (out of 1 available) 8218 1726776677.95377: exiting _queue_task() for managed_node2/include_tasks 8218 1726776677.95389: done queuing things up, now waiting for results queue to drain 8218 1726776677.95391: waiting for pending results... 
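An aside on the "Check sysctl settings for boolean values" conditional shown above: the task was skipped because no sysctl entry carried a literal boolean value. The Jinja2 test `sameas` is an identity comparison, so the string `"1"` does not match `true`. A minimal Python sketch of that guard (illustrative only, not the role's actual code; the entry shape is assumed from the `selectattr("value", ...)` chain in the log):

```python
# Emulate: (kernel_settings_sysctl | selectattr("value", "defined")
#           | selectattr("value", "sameas", true) | list | length > 0) or
#          (... | selectattr("value", "sameas", false) | list | length > 0)
def has_boolean_values(kernel_settings_sysctl):
    """True if any entry's 'value' is literally the boolean True or False."""
    defined = [e for e in kernel_settings_sysctl if "value" in e]
    # 'sameas' in Jinja2 is identity, which Python's 'is' mirrors:
    trues = [e for e in defined if e["value"] is True]
    falses = [e for e in defined if e["value"] is False]
    return len(trues) > 0 or len(falses) > 0

settings = [{"name": "fs.file-max", "value": "65536"}]
print(has_boolean_values(settings))  # → False: string values pass, task skips
bad = [{"name": "vm.swappiness", "value": True}]
print(has_boolean_values(bad))       # → True: a raw YAML boolean would fail the role
```

Note that `1 is True` is `False` in Python, matching `sameas` semantics: only genuine booleans trigger the fail task.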
10757 1726776677.95516: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10757 1726776677.95633: in run() - task 120fa90a-8a95-cec2-986e-00000000060e 10757 1726776677.95650: variable 'ansible_search_path' from source: unknown 10757 1726776677.95656: variable 'ansible_search_path' from source: unknown 10757 1726776677.95683: calling self._execute() 10757 1726776677.95748: variable 'ansible_host' from source: host vars for 'managed_node2' 10757 1726776677.95759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10757 1726776677.95767: variable 'omit' from source: magic vars 10757 1726776677.95843: _execute() done 10757 1726776677.95849: dumping result to json 10757 1726776677.95855: done dumping result, returning 10757 1726776677.95861: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-cec2-986e-00000000060e] 10757 1726776677.95868: sending task result for task 120fa90a-8a95-cec2-986e-00000000060e 10757 1726776677.95888: done sending task result for task 120fa90a-8a95-cec2-986e-00000000060e 10757 1726776677.95891: WORKER PROCESS EXITING 8218 1726776677.96087: no more pending results, returning what we have 8218 1726776677.96090: in VariableManager get_vars() 8218 1726776677.96118: Calling all_inventory to load vars for managed_node2 8218 1726776677.96120: Calling groups_inventory to load vars for managed_node2 8218 1726776677.96121: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.96131: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.96133: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.96135: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.96277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 
1726776677.96387: done with get_vars() 8218 1726776677.96392: variable 'ansible_search_path' from source: unknown 8218 1726776677.96393: variable 'ansible_search_path' from source: unknown 8218 1726776677.96416: we have included files to process 8218 1726776677.96417: generating all_blocks data 8218 1726776677.96418: done generating all_blocks data 8218 1726776677.96424: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776677.96425: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776677.96426: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8218 1726776677.96870: done processing included file 8218 1726776677.96872: iterating over new_blocks loaded from include file 8218 1726776677.96873: in VariableManager get_vars() 8218 1726776677.96888: done with get_vars() 8218 1726776677.96889: filtering new block on tags 8218 1726776677.96905: done filtering new block on tags 8218 1726776677.96907: in VariableManager get_vars() 8218 1726776677.96919: done with get_vars() 8218 1726776677.96920: filtering new block on tags 8218 1726776677.96944: done filtering new block on tags 8218 1726776677.96945: in VariableManager get_vars() 8218 1726776677.96961: done with get_vars() 8218 1726776677.96962: filtering new block on tags 8218 1726776677.96984: done filtering new block on tags 8218 1726776677.96986: in VariableManager get_vars() 8218 1726776677.96999: done with get_vars() 8218 1726776677.96999: filtering new block on tags 8218 1726776677.97014: done filtering new block on tags 8218 1726776677.97016: done iterating over new_blocks loaded from include file 8218 1726776677.97016: 
extending task lists for all hosts with included blocks 8218 1726776677.97132: done extending task lists 8218 1726776677.97133: done processing included files 8218 1726776677.97134: results queue empty 8218 1726776677.97134: checking for any_errors_fatal 8218 1726776677.97137: done checking for any_errors_fatal 8218 1726776677.97137: checking for max_fail_percentage 8218 1726776677.97138: done checking for max_fail_percentage 8218 1726776677.97138: checking to see if all hosts have failed and the running result is not ok 8218 1726776677.97139: done checking to see if all hosts have failed 8218 1726776677.97139: getting the remaining hosts for this loop 8218 1726776677.97140: done getting the remaining hosts for this loop 8218 1726776677.97141: getting the next task for host managed_node2 8218 1726776677.97144: done getting next task for host managed_node2 8218 1726776677.97145: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8218 1726776677.97147: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776677.97154: getting variables 8218 1726776677.97155: in VariableManager get_vars() 8218 1726776677.97163: Calling all_inventory to load vars for managed_node2 8218 1726776677.97165: Calling groups_inventory to load vars for managed_node2 8218 1726776677.97167: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776677.97170: Calling all_plugins_play to load vars for managed_node2 8218 1726776677.97172: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776677.97173: Calling groups_plugins_play to load vars for managed_node2 8218 1726776677.97248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776677.97355: done with get_vars() 8218 1726776677.97361: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:11:17 -0400 (0:00:00.022) 0:01:03.804 **** 8218 1726776677.97408: entering _queue_task() for managed_node2/setup 8218 1726776677.97566: worker is 1 (out of 1 available) 8218 1726776677.97579: exiting _queue_task() for managed_node2/setup 8218 1726776677.97591: done queuing things up, now waiting for results queue to drain 8218 1726776677.97593: waiting for pending results... 
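The "Ensure ansible_facts used by role" setup task queued here is guarded by `__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0`, i.e. facts are gathered only when some required fact name is still missing. A sketch of that set-difference check, with hypothetical fact names (the real list lives in the role's vars, not in this log):

```python
# Emulate the Jinja2 'difference' filter guard on gathered facts.
def missing_facts(required, ansible_facts):
    """Names in `required` not yet present among gathered facts."""
    return sorted(set(required) - set(ansible_facts))

# Illustrative names only; the role defines its own required-facts list.
required = ["distribution", "distribution_major_version", "os_family"]
gathered = {"distribution": "CentOS", "distribution_major_version": "8",
            "os_family": "RedHat"}
print(missing_facts(required, gathered))  # → []  (guard is False, setup skips)
print(missing_facts(required, {"os_family": "RedHat"}))
# → ['distribution', 'distribution_major_version']  (guard True, facts gathered)
```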
10758 1726776677.97720: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10758 1726776677.97849: in run() - task 120fa90a-8a95-cec2-986e-000000000789 10758 1726776677.97867: variable 'ansible_search_path' from source: unknown 10758 1726776677.97871: variable 'ansible_search_path' from source: unknown 10758 1726776677.97897: calling self._execute() 10758 1726776677.97962: variable 'ansible_host' from source: host vars for 'managed_node2' 10758 1726776677.97972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10758 1726776677.97980: variable 'omit' from source: magic vars 10758 1726776677.98341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10758 1726776678.00041: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10758 1726776678.00088: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10758 1726776678.00116: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10758 1726776678.00148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10758 1726776678.00171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10758 1726776678.00225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10758 1726776678.00248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10758 1726776678.00268: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10758 1726776678.00295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10758 1726776678.00306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10758 1726776678.00347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10758 1726776678.00366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10758 1726776678.00383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10758 1726776678.00408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10758 1726776678.00419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10758 1726776678.00530: variable '__kernel_settings_required_facts' from source: role '' all vars 10758 1726776678.00541: variable 'ansible_facts' from source: unknown 10758 1726776678.00596: Evaluated conditional 
(__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10758 1726776678.00601: when evaluation is False, skipping this task 10758 1726776678.00605: _execute() done 10758 1726776678.00609: dumping result to json 10758 1726776678.00613: done dumping result, returning 10758 1726776678.00619: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-cec2-986e-000000000789] 10758 1726776678.00625: sending task result for task 120fa90a-8a95-cec2-986e-000000000789 10758 1726776678.00648: done sending task result for task 120fa90a-8a95-cec2-986e-000000000789 10758 1726776678.00652: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8218 1726776678.00756: no more pending results, returning what we have 8218 1726776678.00759: results queue empty 8218 1726776678.00760: checking for any_errors_fatal 8218 1726776678.00762: done checking for any_errors_fatal 8218 1726776678.00762: checking for max_fail_percentage 8218 1726776678.00764: done checking for max_fail_percentage 8218 1726776678.00764: checking to see if all hosts have failed and the running result is not ok 8218 1726776678.00765: done checking to see if all hosts have failed 8218 1726776678.00766: getting the remaining hosts for this loop 8218 1726776678.00767: done getting the remaining hosts for this loop 8218 1726776678.00770: getting the next task for host managed_node2 8218 1726776678.00778: done getting next task for host managed_node2 8218 1726776678.00782: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8218 1726776678.00785: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776678.00799: getting variables 8218 1726776678.00801: in VariableManager get_vars() 8218 1726776678.00834: Calling all_inventory to load vars for managed_node2 8218 1726776678.00837: Calling groups_inventory to load vars for managed_node2 8218 1726776678.00839: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776678.00847: Calling all_plugins_play to load vars for managed_node2 8218 1726776678.00850: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776678.00852: Calling groups_plugins_play to load vars for managed_node2 8218 1726776678.00971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776678.01267: done with get_vars() 8218 1726776678.01274: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:11:18 -0400 (0:00:00.039) 0:01:03.843 **** 8218 1726776678.01338: entering _queue_task() for managed_node2/stat 8218 1726776678.01489: worker is 1 (out of 1 available) 8218 1726776678.01501: exiting _queue_task() for 
managed_node2/stat 8218 1726776678.01511: done queuing things up, now waiting for results queue to drain 8218 1726776678.01513: waiting for pending results... 10759 1726776678.01641: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10759 1726776678.01768: in run() - task 120fa90a-8a95-cec2-986e-00000000078b 10759 1726776678.01783: variable 'ansible_search_path' from source: unknown 10759 1726776678.01787: variable 'ansible_search_path' from source: unknown 10759 1726776678.01812: calling self._execute() 10759 1726776678.01876: variable 'ansible_host' from source: host vars for 'managed_node2' 10759 1726776678.01885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10759 1726776678.01893: variable 'omit' from source: magic vars 10759 1726776678.02212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10759 1726776678.02382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10759 1726776678.02415: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10759 1726776678.02442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10759 1726776678.02472: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10759 1726776678.02531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10759 1726776678.02550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10759 1726776678.02571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10759 1726776678.02590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10759 1726776678.02681: variable '__kernel_settings_is_ostree' from source: set_fact
10759 1726776678.02692: Evaluated conditional (not __kernel_settings_is_ostree is defined): False
10759 1726776678.02696: when evaluation is False, skipping this task
10759 1726776678.02700: _execute() done
10759 1726776678.02704: dumping result to json
10759 1726776678.02707: done dumping result, returning
10759 1726776678.02712: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-cec2-986e-00000000078b]
10759 1726776678.02719: sending task result for task 120fa90a-8a95-cec2-986e-00000000078b
10759 1726776678.02743: done sending task result for task 120fa90a-8a95-cec2-986e-00000000078b
10759 1726776678.02747: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
8218 1726776678.02859: no more pending results, returning what we have
8218 1726776678.02861: results queue empty
8218 1726776678.02862: checking for any_errors_fatal
8218 1726776678.02870: done checking for any_errors_fatal
8218 1726776678.02871: checking for max_fail_percentage
8218 1726776678.02872: done checking for max_fail_percentage
8218 1726776678.02873: checking to see if all hosts have failed and the running result is not ok
8218 1726776678.02874: done checking to see if all hosts have failed
8218 1726776678.02874: getting the remaining hosts for this loop
8218 1726776678.02875: done getting the remaining hosts for this loop
8218 1726776678.02878: getting the next task for host managed_node2
8218 1726776678.02884: done getting next task for host managed_node2
8218 1726776678.02887: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
8218 1726776678.02890: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776678.02903: getting variables
8218 1726776678.02904: in VariableManager get_vars()
8218 1726776678.02932: Calling all_inventory to load vars for managed_node2
8218 1726776678.02934: Calling groups_inventory to load vars for managed_node2
8218 1726776678.02936: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776678.02942: Calling all_plugins_play to load vars for managed_node2
8218 1726776678.02944: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776678.02945: Calling groups_plugins_play to load vars for managed_node2
8218 1726776678.03047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776678.03167: done with get_vars()
8218 1726776678.03174: done getting variables
8218 1726776678.03211: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15
Thursday 19 September 2024 16:11:18 -0400 (0:00:00.018) 0:01:03.862 ****
8218 1726776678.03234: entering _queue_task() for managed_node2/set_fact
8218 1726776678.03372: worker is 1 (out of 1 available)
8218 1726776678.03383: exiting _queue_task() for managed_node2/set_fact
8218 1726776678.03393: done queuing things up, now waiting for results queue to drain
8218 1726776678.03395: waiting for pending results...
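The two ostree tasks in this stretch of the log follow a common guard pattern in the linux-system-roles collections: probe once, cache the answer in a fact, and skip the probe on every later include. A minimal sketch of that pattern is below; only the fact name `__kernel_settings_is_ostree` and the task names appear in the log itself, while the stat path and register name are assumptions:

```yaml
# Hedged sketch of the ostree-detection guard seen in the log above.
# /run/ostree-booted and __ostree_booted_stat are assumed names; the
# fact __kernel_settings_is_ostree comes from the log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted  # assumed marker file for ostree systems
  register: __ostree_booted_stat
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined
```

Because the fact was already set by an earlier run through these tasks, `not __kernel_settings_is_ostree is defined` evaluates to False, which is exactly the `false_condition` reported in the skipped-task JSON above.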
10760 1726776678.03515: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree
10760 1726776678.03624: in run() - task 120fa90a-8a95-cec2-986e-00000000078c
10760 1726776678.03641: variable 'ansible_search_path' from source: unknown
10760 1726776678.03645: variable 'ansible_search_path' from source: unknown
10760 1726776678.03673: calling self._execute()
10760 1726776678.03737: variable 'ansible_host' from source: host vars for 'managed_node2'
10760 1726776678.03746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10760 1726776678.03757: variable 'omit' from source: magic vars
10760 1726776678.04072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
10760 1726776678.04295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
10760 1726776678.04330: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
10760 1726776678.04357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
10760 1726776678.04384: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
10760 1726776678.04439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
10760 1726776678.04461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
10760 1726776678.04480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10760 1726776678.04500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10760 1726776678.04586: variable '__kernel_settings_is_ostree' from source: set_fact
10760 1726776678.04596: Evaluated conditional (not __kernel_settings_is_ostree is defined): False
10760 1726776678.04599: when evaluation is False, skipping this task
10760 1726776678.04601: _execute() done
10760 1726776678.04603: dumping result to json
10760 1726776678.04605: done dumping result, returning
10760 1726776678.04610: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-cec2-986e-00000000078c]
10760 1726776678.04614: sending task result for task 120fa90a-8a95-cec2-986e-00000000078c
10760 1726776678.04649: done sending task result for task 120fa90a-8a95-cec2-986e-00000000078c
10760 1726776678.04656: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
8218 1726776678.04747: no more pending results, returning what we have
8218 1726776678.04750: results queue empty
8218 1726776678.04751: checking for any_errors_fatal
8218 1726776678.04755: done checking for any_errors_fatal
8218 1726776678.04756: checking for max_fail_percentage
8218 1726776678.04757: done checking for max_fail_percentage
8218 1726776678.04757: checking to see if all hosts have failed and the running result is not ok
8218 1726776678.04758: done checking to see if all hosts have failed
8218 1726776678.04759: getting the remaining hosts for this loop
8218 1726776678.04760: done getting the remaining hosts for this loop
8218 1726776678.04763: getting the next task for host managed_node2
8218 1726776678.04770: done getting next task for host managed_node2
8218 1726776678.04773: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
8218 1726776678.04776: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776678.04790: getting variables
8218 1726776678.04791: in VariableManager get_vars()
8218 1726776678.04821: Calling all_inventory to load vars for managed_node2
8218 1726776678.04824: Calling groups_inventory to load vars for managed_node2
8218 1726776678.04825: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776678.04835: Calling all_plugins_play to load vars for managed_node2
8218 1726776678.04838: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776678.04840: Calling groups_plugins_play to load vars for managed_node2
8218 1726776678.04988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776678.05103: done with get_vars()
8218 1726776678.05109: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22
Thursday 19 September 2024 16:11:18 -0400 (0:00:00.019) 0:01:03.882 ****
8218 1726776678.05174: entering _queue_task() for managed_node2/stat
8218 1726776678.05316: worker is 1 (out of 1 available)
8218 1726776678.05330: exiting _queue_task() for managed_node2/stat
8218 1726776678.05341: done queuing things up, now waiting for results queue to drain
8218 1726776678.05343: waiting for pending results...
10761 1726776678.05469: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin
10761 1726776678.05592: in run() - task 120fa90a-8a95-cec2-986e-00000000078e
10761 1726776678.05608: variable 'ansible_search_path' from source: unknown
10761 1726776678.05612: variable 'ansible_search_path' from source: unknown
10761 1726776678.05640: calling self._execute()
10761 1726776678.05704: variable 'ansible_host' from source: host vars for 'managed_node2'
10761 1726776678.05713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10761 1726776678.05721: variable 'omit' from source: magic vars
10761 1726776678.06050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
10761 1726776678.06223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
10761 1726776678.06261: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
10761 1726776678.06289: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
10761 1726776678.06316: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
10761 1726776678.06379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
10761 1726776678.06400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
10761 1726776678.06419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10761 1726776678.06440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10761 1726776678.06528: variable '__kernel_settings_is_transactional' from source: set_fact
10761 1726776678.06541: Evaluated conditional (not __kernel_settings_is_transactional is defined): False
10761 1726776678.06546: when evaluation is False, skipping this task
10761 1726776678.06550: _execute() done
10761 1726776678.06556: dumping result to json
10761 1726776678.06559: done dumping result, returning
10761 1726776678.06565: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-cec2-986e-00000000078e]
10761 1726776678.06571: sending task result for task 120fa90a-8a95-cec2-986e-00000000078e
10761 1726776678.06593: done sending task result for task 120fa90a-8a95-cec2-986e-00000000078e
10761 1726776678.06597: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
8218 1726776678.06701: no more pending results, returning what we have
8218 1726776678.06704: results queue empty
8218 1726776678.06705: checking for any_errors_fatal
8218 1726776678.06710: done checking for any_errors_fatal
8218 1726776678.06710: checking for max_fail_percentage
8218 1726776678.06711: done checking for max_fail_percentage
8218 1726776678.06712: checking to see if all hosts have failed and the running result is not ok
8218 1726776678.06713: done checking to see if all hosts have failed
8218 1726776678.06714: getting the remaining hosts for this loop
8218 1726776678.06715: done getting the remaining hosts for this loop
8218 1726776678.06718: getting the next task for host managed_node2
8218 1726776678.06724: done getting next task for host managed_node2
8218 1726776678.06727: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
8218 1726776678.06731: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776678.06746: getting variables
8218 1726776678.06747: in VariableManager get_vars()
8218 1726776678.06778: Calling all_inventory to load vars for managed_node2
8218 1726776678.06780: Calling groups_inventory to load vars for managed_node2
8218 1726776678.06782: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776678.06790: Calling all_plugins_play to load vars for managed_node2
8218 1726776678.06793: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776678.06795: Calling groups_plugins_play to load vars for managed_node2
8218 1726776678.06906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776678.07034: done with get_vars()
8218 1726776678.07042: done getting variables
8218 1726776678.07080: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27
Thursday 19 September 2024 16:11:18 -0400 (0:00:00.019) 0:01:03.901 ****
8218 1726776678.07102: entering _queue_task() for managed_node2/set_fact
8218 1726776678.07251: worker is 1 (out of 1 available)
8218 1726776678.07262: exiting _queue_task() for managed_node2/set_fact
8218 1726776678.07274: done queuing things up, now waiting for results queue to drain
8218 1726776678.07275: waiting for pending results...
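The transactional-update pair works the same way as the ostree pair: a `stat` probe behind a guard, then a `set_fact` that caches the answer. A minimal sketch, assuming the stat path and register name (the queued action types `stat` and `set_fact`, the task names, and the fact `__kernel_settings_is_transactional` are all visible in the log):

```yaml
# Hedged sketch of the transactional-update guard from the log above.
# /sbin/transactional-update matches the task name; the register name
# __transactional_update_stat is an assumed placeholder.
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat
  when: not __kernel_settings_is_transactional is defined

- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
  when: not __kernel_settings_is_transactional is defined
```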
10762 1726776678.07401: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists
10762 1726776678.07521: in run() - task 120fa90a-8a95-cec2-986e-00000000078f
10762 1726776678.07537: variable 'ansible_search_path' from source: unknown
10762 1726776678.07541: variable 'ansible_search_path' from source: unknown
10762 1726776678.07569: calling self._execute()
10762 1726776678.07638: variable 'ansible_host' from source: host vars for 'managed_node2'
10762 1726776678.07647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10762 1726776678.07657: variable 'omit' from source: magic vars
10762 1726776678.07975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
10762 1726776678.08200: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
10762 1726776678.08235: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
10762 1726776678.08267: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
10762 1726776678.08303: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
10762 1726776678.08379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
10762 1726776678.08396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
10762 1726776678.08411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
10762 1726776678.08426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
10762 1726776678.08510: variable '__kernel_settings_is_transactional' from source: set_fact
10762 1726776678.08519: Evaluated conditional (not __kernel_settings_is_transactional is defined): False
10762 1726776678.08522: when evaluation is False, skipping this task
10762 1726776678.08524: _execute() done
10762 1726776678.08526: dumping result to json
10762 1726776678.08528: done dumping result, returning
10762 1726776678.08534: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-cec2-986e-00000000078f]
10762 1726776678.08538: sending task result for task 120fa90a-8a95-cec2-986e-00000000078f
10762 1726776678.08560: done sending task result for task 120fa90a-8a95-cec2-986e-00000000078f
10762 1726776678.08562: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __kernel_settings_is_transactional is defined",
    "skip_reason": "Conditional result was False"
}
8218 1726776678.08748: no more pending results, returning what we have
8218 1726776678.08751: results queue empty
8218 1726776678.08751: checking for any_errors_fatal
8218 1726776678.08758: done checking for any_errors_fatal
8218 1726776678.08759: checking for max_fail_percentage
8218 1726776678.08760: done checking for max_fail_percentage
8218 1726776678.08760: checking to see if all hosts have failed and the running result is not ok
8218 1726776678.08761: done checking to see if all hosts have failed
8218 1726776678.08761: getting the remaining hosts for this loop
8218 1726776678.08762: done getting the remaining hosts for this loop
8218 1726776678.08764: getting the next task for host managed_node2
8218 1726776678.08770: done getting next task for host managed_node2
8218 1726776678.08772: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
8218 1726776678.08774: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776678.08785: getting variables
8218 1726776678.08786: in VariableManager get_vars()
8218 1726776678.08808: Calling all_inventory to load vars for managed_node2
8218 1726776678.08810: Calling groups_inventory to load vars for managed_node2
8218 1726776678.08811: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776678.08819: Calling all_plugins_play to load vars for managed_node2
8218 1726776678.08822: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776678.08823: Calling groups_plugins_play to load vars for managed_node2
8218 1726776678.08977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776678.09092: done with get_vars()
8218 1726776678.09098: done getting variables
8218 1726776678.09138: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31
Thursday 19 September 2024 16:11:18 -0400 (0:00:00.020) 0:01:03.922 ****
8218 1726776678.09163: entering _queue_task() for managed_node2/include_vars
8218 1726776678.09301: worker is 1 (out of 1 available)
8218 1726776678.09315: exiting _queue_task() for managed_node2/include_vars
8218 1726776678.09327: done queuing things up, now waiting for results queue to drain
8218 1726776678.09330: waiting for pending results...
10763 1726776678.09449: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables
10763 1726776678.09566: in run() - task 120fa90a-8a95-cec2-986e-000000000791
10763 1726776678.09582: variable 'ansible_search_path' from source: unknown
10763 1726776678.09586: variable 'ansible_search_path' from source: unknown
10763 1726776678.09611: calling self._execute()
10763 1726776678.09676: variable 'ansible_host' from source: host vars for 'managed_node2'
10763 1726776678.09685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10763 1726776678.09693: variable 'omit' from source: magic vars
10763 1726776678.09763: variable 'omit' from source: magic vars
10763 1726776678.09805: variable 'omit' from source: magic vars
10763 1726776678.10076: variable 'ffparams' from source: task vars
10763 1726776678.10172: variable 'ansible_facts' from source: unknown
10763 1726776678.10301: variable 'ansible_facts' from source: unknown
10763 1726776678.10471: variable 'ansible_facts' from source: unknown
10763 1726776678.10602: variable 'ansible_facts' from source: unknown
10763 1726776678.10713: variable 'role_path' from source: magic vars
10763 1726776678.10868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
10763 1726776678.11073: Loaded config def from plugin (lookup/first_found)
10763 1726776678.11081: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py
10763 1726776678.11108: variable 'ansible_search_path' from source: unknown
10763 1726776678.11126: variable 'ansible_search_path' from source: unknown
10763 1726776678.11136: variable 'ansible_search_path' from source: unknown
10763 1726776678.11145: variable 'ansible_search_path' from source: unknown
10763 1726776678.11152: variable 'ansible_search_path' from source: unknown
10763 1726776678.11167: variable 'omit' from source: magic vars
10763 1726776678.11185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10763 1726776678.11203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10763 1726776678.11219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10763 1726776678.11235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10763 1726776678.11244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10763 1726776678.11266: variable 'inventory_hostname' from source: host vars for 'managed_node2'
10763 1726776678.11271: variable 'ansible_host' from source: host vars for 'managed_node2'
10763 1726776678.11275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10763 1726776678.11335: Set connection var ansible_connection to ssh
10763 1726776678.11344: Set connection var ansible_pipelining to False
10763 1726776678.11350: Set connection var ansible_timeout to 10
10763 1726776678.11358: Set connection var ansible_module_compression to ZIP_DEFLATED
10763 1726776678.11363: Set connection var ansible_shell_type to sh
10763 1726776678.11368: Set connection var ansible_shell_executable to /bin/sh
10763 1726776678.11384: variable 'ansible_shell_executable' from source: unknown
10763 1726776678.11387: variable 'ansible_connection' from source: unknown
10763 1726776678.11391: variable 'ansible_module_compression' from source: unknown
10763 1726776678.11395: variable 'ansible_shell_type' from source: unknown
10763 1726776678.11398: variable 'ansible_shell_executable' from source: unknown
10763 1726776678.11402: variable 'ansible_host' from source: host vars for 'managed_node2'
10763 1726776678.11406: variable 'ansible_pipelining' from source: unknown
10763 1726776678.11410: variable 'ansible_timeout' from source: unknown
10763 1726776678.11414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10763 1726776678.11483: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
10763 1726776678.11495: variable 'omit' from source: magic vars
10763 1726776678.11500: starting attempt loop
10763 1726776678.11503: running the handler
10763 1726776678.11547: handler run complete
10763 1726776678.11557: attempt loop complete, returning result
10763 1726776678.11561: _execute() done
10763 1726776678.11564: dumping result to json
10763 1726776678.11569: done dumping result, returning
10763 1726776678.11575: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-cec2-986e-000000000791]
10763 1726776678.11581: sending task result for task 120fa90a-8a95-cec2-986e-000000000791
10763 1726776678.11605: done sending task result for task 120fa90a-8a95-cec2-986e-000000000791
10763 1726776678.11609: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "__kernel_settings_packages": [
            "tuned",
            "python3-configobj"
        ],
        "__kernel_settings_services": [
            "tuned"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml"
    ],
    "changed": false
}
8218 1726776678.11787: no more pending results, returning what we have
8218 1726776678.11791: results queue empty
8218 1726776678.11792: checking for any_errors_fatal
8218 1726776678.11802: done checking for any_errors_fatal
8218 1726776678.11803: checking for max_fail_percentage
8218 1726776678.11804: done checking for max_fail_percentage
8218 1726776678.11805: checking to see if all hosts have failed and the running result is not ok
8218 1726776678.11806: done checking to see if all hosts have failed
8218 1726776678.11807: getting the remaining hosts for this loop
8218 1726776678.11809: done getting the remaining hosts for this loop
8218 1726776678.11813: getting the next task for host managed_node2
8218 1726776678.11821: done getting next task for host managed_node2
8218 1726776678.11824: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed
8218 1726776678.11826: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776678.11839: getting variables
8218 1726776678.11841: in VariableManager get_vars()
8218 1726776678.11881: Calling all_inventory to load vars for managed_node2
8218 1726776678.11884: Calling groups_inventory to load vars for managed_node2
8218 1726776678.11886: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776678.11898: Calling all_plugins_play to load vars for managed_node2
8218 1726776678.11901: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776678.11904: Calling groups_plugins_play to load vars for managed_node2
8218 1726776678.12091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776678.12294: done with get_vars()
8218 1726776678.12304: done getting variables
8218 1726776678.12362: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Thursday 19 September 2024 16:11:18 -0400 (0:00:00.032) 0:01:03.954 ****
8218 1726776678.12393: entering _queue_task() for managed_node2/package
8218 1726776678.12552: worker is 1 (out of 1 available)
8218 1726776678.12567: exiting _queue_task() for managed_node2/package
8218 1726776678.12579: done queuing things up, now waiting for results queue to drain
8218 1726776678.12581: waiting for pending results...
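The "Set platform/version specific variables" result above is produced by an `include_vars` task driven by the `first_found` lookup: the log confirms the `ffparams` task var, the `first_found` lookup plugin being loaded, and `vars/default.yml` ultimately being included, and the facts it defines feed the following `package` task. A hedged sketch under those observations; the exact candidate-file list inside `ffparams` is an assumption, not confirmed by the log:

```yaml
# Hedged sketch of the include_vars/first_found pattern and the package
# task queued above. The distribution-specific filenames are assumed;
# the log only shows that default.yml was the file finally included.
- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - default.yml  # fallback selected in this run
      paths:
        - "{{ role_path }}/vars"

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"  # ["tuned", "python3-configobj"] per the ok: result above
    state: present
```

Each `variable 'ansible_facts' from source: unknown` line in the debug stream corresponds to one candidate filename being templated before the lookup settles on the first file that exists.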
10765 1726776678.12704: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed
10765 1726776678.12813: in run() - task 120fa90a-8a95-cec2-986e-00000000060f
10765 1726776678.12830: variable 'ansible_search_path' from source: unknown
10765 1726776678.12835: variable 'ansible_search_path' from source: unknown
10765 1726776678.12864: calling self._execute()
10765 1726776678.12931: variable 'ansible_host' from source: host vars for 'managed_node2'
10765 1726776678.12939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10765 1726776678.12947: variable 'omit' from source: magic vars
10765 1726776678.13025: variable 'omit' from source: magic vars
10765 1726776678.13059: variable 'omit' from source: magic vars
10765 1726776678.13077: variable '__kernel_settings_packages' from source: include_vars
10765 1726776678.13291: variable '__kernel_settings_packages' from source: include_vars
10765 1726776678.13508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
10765 1726776678.15189: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
10765 1726776678.15256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
10765 1726776678.15293: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
10765 1726776678.15330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
10765 1726776678.15369: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
10765 1726776678.15457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
10765 1726776678.15484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
10765 1726776678.15507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
10765 1726776678.15546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
10765 1726776678.15560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
10765 1726776678.15656: variable '__kernel_settings_is_ostree' from source: set_fact
10765 1726776678.15664: variable 'omit' from source: magic vars
10765 1726776678.15692: variable 'omit' from source: magic vars
10765 1726776678.15718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
10765 1726776678.15744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
10765 1726776678.15762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
10765 1726776678.15779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10765 1726776678.15790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
10765 1726776678.15818: variable 'inventory_hostname' from source: host vars for 'managed_node2'
10765 1726776678.15825: variable 'ansible_host' from source: host vars for 'managed_node2'
10765 1726776678.15830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10765 1726776678.15923: Set connection var ansible_connection to ssh
10765 1726776678.15933: Set connection var ansible_pipelining to False
10765 1726776678.15940: Set connection var ansible_timeout to 10
10765 1726776678.15948: Set connection var ansible_module_compression to ZIP_DEFLATED
10765 1726776678.15953: Set connection var ansible_shell_type to sh
10765 1726776678.15958: Set connection var ansible_shell_executable to /bin/sh
10765 1726776678.15978: variable 'ansible_shell_executable' from source: unknown
10765 1726776678.15982: variable 'ansible_connection' from source: unknown
10765 1726776678.15984: variable 'ansible_module_compression' from source: unknown
10765 1726776678.15986: variable 'ansible_shell_type' from source: unknown
10765 1726776678.15987: variable 'ansible_shell_executable' from source: unknown
10765 1726776678.15989: variable 'ansible_host' from source: host vars for 'managed_node2'
10765 1726776678.15991: variable 'ansible_pipelining' from source: unknown
10765 1726776678.15993: variable 'ansible_timeout' from source: unknown
10765 1726776678.15995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
10765 1726776678.16069: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
10765 1726776678.16082: variable 'omit' from source: magic vars
10765 1726776678.16087: starting attempt loop
10765 1726776678.16091: running the handler
10765 1726776678.16158: variable 'ansible_facts' from source: unknown
10765 1726776678.16240: _low_level_execute_command(): starting
10765 1726776678.16249: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep
0' 10765 1726776678.19030: stdout chunk (state=2): >>>/root <<< 10765 1726776678.19149: stderr chunk (state=3): >>><<< 10765 1726776678.19157: stdout chunk (state=3): >>><<< 10765 1726776678.19178: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10765 1726776678.19190: _low_level_execute_command(): starting 10765 1726776678.19196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118 `" && echo ansible-tmp-1726776678.1918643-10765-163061444382118="` echo /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118 `" ) && sleep 0' 10765 1726776678.21976: stdout chunk (state=2): >>>ansible-tmp-1726776678.1918643-10765-163061444382118=/root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118 <<< 10765 1726776678.22200: stderr chunk (state=3): >>><<< 10765 1726776678.22208: stdout chunk (state=3): >>><<< 10765 1726776678.22223: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776678.1918643-10765-163061444382118=/root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118 , stderr= 10765 1726776678.22249: variable 'ansible_module_compression' from source: unknown 10765 1726776678.22291: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10765 1726776678.22327: variable 'ansible_facts' from source: unknown 10765 1726776678.22416: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/AnsiballZ_dnf.py 10765 1726776678.22512: Sending initial data 10765 1726776678.22519: Sent initial data (151 bytes) 10765 1726776678.25035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpc4y5y97o /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/AnsiballZ_dnf.py <<< 10765 1726776678.26528: stderr chunk 
(state=3): >>><<< 10765 1726776678.26536: stdout chunk (state=3): >>><<< 10765 1726776678.26556: done transferring module to remote 10765 1726776678.26567: _low_level_execute_command(): starting 10765 1726776678.26572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/ /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/AnsiballZ_dnf.py && sleep 0' 10765 1726776678.29078: stderr chunk (state=2): >>><<< 10765 1726776678.29085: stdout chunk (state=2): >>><<< 10765 1726776678.29096: _low_level_execute_command() done: rc=0, stdout=, stderr= 10765 1726776678.29100: _low_level_execute_command(): starting 10765 1726776678.29105: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/AnsiballZ_dnf.py && sleep 0' 10765 1726776680.88898: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10765 1726776680.97149: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10765 1726776680.97199: stderr chunk (state=3): >>><<< 10765 1726776680.97206: stdout chunk (state=3): >>><<< 10765 1726776680.97222: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
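The `_low_level_execute_command()` sequence above shows the per-task remote lifecycle: probe the home directory, `umask 77 && mkdir -p` a uniquely named temp directory, sftp the AnsiballZ payload, `chmod u+x`, run it with the remote Python, then `rm -rf` the directory. The directory name in the log, `ansible-tmp-1726776678.1918643-10765-163061444382118`, appears to be built from an epoch timestamp, the worker PID, and a random suffix; the exact recipe below is an assumption inferred from the log, not a documented API:

```python
import os
import random
import time

def remote_tmp_name() -> str:
    """Assumed construction: ansible-tmp-<epoch>-<pid>-<random>."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))

name = remote_tmp_name()
print(name)
```

Because the name embeds the timestamp and PID, concurrent workers on the same control node do not collide in `~/.ansible/tmp`.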
10765 1726776680.97260: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10765 1726776680.97269: _low_level_execute_command(): starting 10765 1726776680.97275: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776678.1918643-10765-163061444382118/ > /dev/null 2>&1 && sleep 0' 10765 1726776680.99736: stderr chunk (state=2): >>><<< 10765 1726776680.99746: stdout chunk (state=2): >>><<< 10765 1726776680.99766: _low_level_execute_command() done: rc=0, stdout=, stderr= 10765 1726776680.99775: handler run complete 10765 1726776680.99814: attempt loop complete, returning result 10765 1726776680.99820: _execute() done 10765 1726776680.99823: dumping result to json 10765 1726776680.99830: done dumping result, returning 10765 1726776680.99838: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-cec2-986e-00000000060f] 10765 1726776680.99843: sending task result for task 120fa90a-8a95-cec2-986e-00000000060f 10765 1726776680.99879: done sending task result for task 120fa90a-8a95-cec2-986e-00000000060f 10765 1726776680.99883: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8218 1726776681.00217: no more pending 
results, returning what we have 8218 1726776681.00221: results queue empty 8218 1726776681.00222: checking for any_errors_fatal 8218 1726776681.00234: done checking for any_errors_fatal 8218 1726776681.00235: checking for max_fail_percentage 8218 1726776681.00236: done checking for max_fail_percentage 8218 1726776681.00237: checking to see if all hosts have failed and the running result is not ok 8218 1726776681.00238: done checking to see if all hosts have failed 8218 1726776681.00239: getting the remaining hosts for this loop 8218 1726776681.00240: done getting the remaining hosts for this loop 8218 1726776681.00243: getting the next task for host managed_node2 8218 1726776681.00252: done getting next task for host managed_node2 8218 1726776681.00255: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8218 1726776681.00260: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776681.00272: getting variables 8218 1726776681.00274: in VariableManager get_vars() 8218 1726776681.00311: Calling all_inventory to load vars for managed_node2 8218 1726776681.00313: Calling groups_inventory to load vars for managed_node2 8218 1726776681.00318: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776681.00327: Calling all_plugins_play to load vars for managed_node2 8218 1726776681.00332: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776681.00335: Calling groups_plugins_play to load vars for managed_node2 8218 1726776681.00579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776681.00788: done with get_vars() 8218 1726776681.00800: done getting variables 8218 1726776681.00868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:11:21 -0400 (0:00:02.885) 0:01:06.839 **** 8218 1726776681.00899: entering _queue_task() for managed_node2/debug 8218 1726776681.01114: worker is 1 (out of 1 available) 8218 1726776681.01135: exiting _queue_task() for managed_node2/debug 8218 1726776681.01147: done queuing things up, now waiting for results queue to drain 8218 1726776681.01149: waiting for pending results... 
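The `Calling all_inventory ... Calling groups_plugins_play` lines above are VariableManager layering variable sources from lowest to highest precedence, with later sources overriding earlier ones. A simplified sketch of that merge (the source names mirror the log; the plain dict update and the sample values are assumptions, not Ansible's full precedence machinery):

```python
from functools import reduce

# Ordered low -> high precedence, as in the get_vars() calls in the log.
sources = [
    ("all_inventory",       {"ansible_timeout": 10}),
    ("groups_inventory",    {"pkg_state": "latest"}),
    ("all_plugins_play",    {"pkg_state": "present"}),   # overrides group value
    ("groups_plugins_play", {"ansible_host": "10.31.12.75"}),
]

merged = reduce(lambda acc, kv: {**acc, **kv[1]}, sources, {})
print(merged)
```

This is why a play-level variable wins over a group default in the log's runs, while untouched keys from earlier layers survive.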
10847 1726776681.01383: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10847 1726776681.01525: in run() - task 120fa90a-8a95-cec2-986e-000000000611 10847 1726776681.01546: variable 'ansible_search_path' from source: unknown 10847 1726776681.01550: variable 'ansible_search_path' from source: unknown 10847 1726776681.01584: calling self._execute() 10847 1726776681.01675: variable 'ansible_host' from source: host vars for 'managed_node2' 10847 1726776681.01685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10847 1726776681.01694: variable 'omit' from source: magic vars 10847 1726776681.02167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10847 1726776681.04404: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10847 1726776681.04484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10847 1726776681.04521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10847 1726776681.04563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10847 1726776681.04594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10847 1726776681.04670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10847 1726776681.04699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10847 1726776681.04726: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10847 1726776681.04772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10847 1726776681.04786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10847 1726776681.04891: variable '__kernel_settings_is_transactional' from source: set_fact 10847 1726776681.04910: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10847 1726776681.04914: when evaluation is False, skipping this task 10847 1726776681.04918: _execute() done 10847 1726776681.04922: dumping result to json 10847 1726776681.04925: done dumping result, returning 10847 1726776681.04934: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000611] 10847 1726776681.04941: sending task result for task 120fa90a-8a95-cec2-986e-000000000611 10847 1726776681.04974: done sending task result for task 120fa90a-8a95-cec2-986e-000000000611 10847 1726776681.04978: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8218 1726776681.05353: no more pending results, returning what we have 8218 1726776681.05359: results queue empty 8218 1726776681.05360: checking for any_errors_fatal 8218 1726776681.05370: done checking for any_errors_fatal 8218 1726776681.05371: checking for max_fail_percentage 8218 1726776681.05373: done checking for max_fail_percentage 8218 1726776681.05374: checking to see if all hosts have failed and the running 
result is not ok 8218 1726776681.05374: done checking to see if all hosts have failed 8218 1726776681.05375: getting the remaining hosts for this loop 8218 1726776681.05376: done getting the remaining hosts for this loop 8218 1726776681.05379: getting the next task for host managed_node2 8218 1726776681.05385: done getting next task for host managed_node2 8218 1726776681.05389: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8218 1726776681.05391: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776681.05406: getting variables 8218 1726776681.05407: in VariableManager get_vars() 8218 1726776681.05443: Calling all_inventory to load vars for managed_node2 8218 1726776681.05446: Calling groups_inventory to load vars for managed_node2 8218 1726776681.05448: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776681.05459: Calling all_plugins_play to load vars for managed_node2 8218 1726776681.05463: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776681.05466: Calling groups_plugins_play to load vars for managed_node2 8218 1726776681.05641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776681.05852: done with get_vars() 8218 1726776681.05866: done getting variables 8218 1726776681.05923: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:11:21 -0400 (0:00:00.050) 0:01:06.890 **** 8218 1726776681.05960: entering _queue_task() for managed_node2/reboot 8218 1726776681.06153: worker is 1 (out of 1 available) 8218 1726776681.06169: exiting _queue_task() for managed_node2/reboot 8218 1726776681.06181: done queuing things up, now waiting for results queue to drain 8218 1726776681.06183: waiting for pending results... 
10849 1726776681.06407: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10849 1726776681.06532: in run() - task 120fa90a-8a95-cec2-986e-000000000612 10849 1726776681.06548: variable 'ansible_search_path' from source: unknown 10849 1726776681.06552: variable 'ansible_search_path' from source: unknown 10849 1726776681.06582: calling self._execute() 10849 1726776681.06654: variable 'ansible_host' from source: host vars for 'managed_node2' 10849 1726776681.06665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10849 1726776681.06672: variable 'omit' from source: magic vars 10849 1726776681.07084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10849 1726776681.09277: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10849 1726776681.09325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10849 1726776681.09355: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10849 1726776681.09383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10849 1726776681.09403: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10849 1726776681.09458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10849 1726776681.09490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10849 1726776681.09509: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10849 1726776681.09538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10849 1726776681.09549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10849 1726776681.09622: variable '__kernel_settings_is_transactional' from source: set_fact 10849 1726776681.09639: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10849 1726776681.09644: when evaluation is False, skipping this task 10849 1726776681.09648: _execute() done 10849 1726776681.09652: dumping result to json 10849 1726776681.09656: done dumping result, returning 10849 1726776681.09662: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-cec2-986e-000000000612] 10849 1726776681.09668: sending task result for task 120fa90a-8a95-cec2-986e-000000000612 10849 1726776681.09689: done sending task result for task 120fa90a-8a95-cec2-986e-000000000612 10849 1726776681.09693: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776681.09795: no more pending results, returning what we have 8218 1726776681.09798: results queue empty 8218 1726776681.09798: checking for any_errors_fatal 8218 1726776681.09804: done checking for any_errors_fatal 8218 1726776681.09805: checking for max_fail_percentage 8218 1726776681.09806: done checking for max_fail_percentage 8218 1726776681.09807: checking 
to see if all hosts have failed and the running result is not ok 8218 1726776681.09807: done checking to see if all hosts have failed 8218 1726776681.09808: getting the remaining hosts for this loop 8218 1726776681.09809: done getting the remaining hosts for this loop 8218 1726776681.09812: getting the next task for host managed_node2 8218 1726776681.09818: done getting next task for host managed_node2 8218 1726776681.09821: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8218 1726776681.09823: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776681.09839: getting variables 8218 1726776681.09840: in VariableManager get_vars() 8218 1726776681.09872: Calling all_inventory to load vars for managed_node2 8218 1726776681.09875: Calling groups_inventory to load vars for managed_node2 8218 1726776681.09877: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776681.09884: Calling all_plugins_play to load vars for managed_node2 8218 1726776681.09886: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776681.09889: Calling groups_plugins_play to load vars for managed_node2 8218 1726776681.10042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776681.10161: done with get_vars() 8218 1726776681.10168: done getting variables 8218 1726776681.10207: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:11:21 -0400 (0:00:00.042) 0:01:06.932 **** 8218 1726776681.10230: entering _queue_task() for managed_node2/fail 8218 1726776681.10381: worker is 1 (out of 1 available) 8218 1726776681.10395: exiting _queue_task() for managed_node2/fail 8218 1726776681.10406: done queuing things up, now waiting for results queue to drain 8218 1726776681.10407: waiting for pending results... 
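Each task banner above carries two timers, e.g. `(0:00:02.885) 0:01:06.932`: the first is the previous task's duration, the second the cumulative play time, in the style of the `profile_tasks` callback. A sketch of that arithmetic using illustrative epoch stamps (the exact start stamp below is assumed for the example, not taken from the log):

```python
from datetime import timedelta

play_start   = 1726776614.054   # assumed play start, for illustration
prev_task_at = 1726776678.115   # previous task banner's timestamp
this_task_at = 1726776681.000   # current task banner's timestamp

task_duration = timedelta(seconds=this_task_at - prev_task_at)
cumulative    = timedelta(seconds=this_task_at - play_start)
print(task_duration, cumulative)
```

The fast skipped tasks in the log (0.050 s, 0.042 s, 0.035 s) barely move the cumulative counter, while the real dnf run contributed the 2.885 s jump.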
10851 1726776681.10531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10851 1726776681.10634: in run() - task 120fa90a-8a95-cec2-986e-000000000613 10851 1726776681.10651: variable 'ansible_search_path' from source: unknown 10851 1726776681.10655: variable 'ansible_search_path' from source: unknown 10851 1726776681.10682: calling self._execute() 10851 1726776681.10744: variable 'ansible_host' from source: host vars for 'managed_node2' 10851 1726776681.10751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10851 1726776681.10757: variable 'omit' from source: magic vars 10851 1726776681.11125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10851 1726776681.12811: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10851 1726776681.12867: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10851 1726776681.12895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10851 1726776681.12921: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10851 1726776681.12944: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10851 1726776681.12997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10851 1726776681.13019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10851 1726776681.13039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10851 1726776681.13069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10851 1726776681.13081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10851 1726776681.13155: variable '__kernel_settings_is_transactional' from source: set_fact 10851 1726776681.13170: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10851 1726776681.13174: when evaluation is False, skipping this task 10851 1726776681.13176: _execute() done 10851 1726776681.13178: dumping result to json 10851 1726776681.13180: done dumping result, returning 10851 1726776681.13184: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-cec2-986e-000000000613] 10851 1726776681.13188: sending task result for task 120fa90a-8a95-cec2-986e-000000000613 10851 1726776681.13206: done sending task result for task 120fa90a-8a95-cec2-986e-000000000613 10851 1726776681.13208: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776681.13331: no more pending results, returning what we have 8218 1726776681.13335: results queue empty 8218 1726776681.13335: checking for any_errors_fatal 8218 1726776681.13343: done checking for any_errors_fatal 8218 1726776681.13344: checking for max_fail_percentage 8218 1726776681.13345: done checking for max_fail_percentage 8218 1726776681.13346: 
checking to see if all hosts have failed and the running result is not ok 8218 1726776681.13346: done checking to see if all hosts have failed 8218 1726776681.13347: getting the remaining hosts for this loop 8218 1726776681.13348: done getting the remaining hosts for this loop 8218 1726776681.13351: getting the next task for host managed_node2 8218 1726776681.13360: done getting next task for host managed_node2 8218 1726776681.13363: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8218 1726776681.13365: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776681.13381: getting variables 8218 1726776681.13382: in VariableManager get_vars() 8218 1726776681.13413: Calling all_inventory to load vars for managed_node2 8218 1726776681.13416: Calling groups_inventory to load vars for managed_node2 8218 1726776681.13418: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776681.13426: Calling all_plugins_play to load vars for managed_node2 8218 1726776681.13430: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776681.13433: Calling groups_plugins_play to load vars for managed_node2 8218 1726776681.13551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776681.13670: done with get_vars() 8218 1726776681.13678: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:11:21 -0400 (0:00:00.035) 0:01:06.967 **** 8218 1726776681.13736: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776681.13884: worker is 1 (out of 1 available) 8218 1726776681.13898: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776681.13910: done queuing things up, now waiting for results queue to drain 8218 1726776681.13912: waiting for pending results... 
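The "Read tuned main config" task queued above runs the role's `kernel_settings_get_config` module, which (per the result later in this log) returns the key/value pairs from `/etc/tuned/tuned-main.conf` as a `data` dict. A minimal sketch of that kind of parse, assuming the file holds simple `key = value` lines; `read_tuned_main_config` is a hypothetical helper for illustration, not the module's actual code:

```python
import tempfile

def read_tuned_main_config(path):
    """Parse a tuned-main.conf-style file of plain `key = value` lines.

    Hypothetical helper: it only mirrors the shape of the `data` dict
    that kernel_settings_get_config reports in this log, nothing more.
    """
    data = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blanks, comments, and anything that is not `key = value`
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            data[key.strip()] = value.strip()
    return data

# demo: write a minimal config and read it back
with tempfile.NamedTemporaryFile("w", suffix=".conf", delete=False) as f:
    f.write("# tuned main config\ndaemon = 1\ndynamic_tuning = 0\n")
    conf_path = f.name

conf = read_tuned_main_config(conf_path)
```

With the two keys written in the demo, `conf` comes back as string values (`"1"`, `"0"`), matching how the module reports every setting as a string in the JSON result below.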
10853 1726776681.14038: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10853 1726776681.14145: in run() - task 120fa90a-8a95-cec2-986e-000000000615 10853 1726776681.14161: variable 'ansible_search_path' from source: unknown 10853 1726776681.14165: variable 'ansible_search_path' from source: unknown 10853 1726776681.14191: calling self._execute() 10853 1726776681.14268: variable 'ansible_host' from source: host vars for 'managed_node2' 10853 1726776681.14278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10853 1726776681.14287: variable 'omit' from source: magic vars 10853 1726776681.14387: variable 'omit' from source: magic vars 10853 1726776681.14432: variable 'omit' from source: magic vars 10853 1726776681.14458: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10853 1726776681.14732: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10853 1726776681.14813: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10853 1726776681.14855: variable 'omit' from source: magic vars 10853 1726776681.14892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10853 1726776681.14924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10853 1726776681.14965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10853 1726776681.15054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10853 1726776681.15068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10853 1726776681.15094: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10853 1726776681.15100: variable 'ansible_host' from 
source: host vars for 'managed_node2' 10853 1726776681.15104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10853 1726776681.15187: Set connection var ansible_connection to ssh 10853 1726776681.15193: Set connection var ansible_pipelining to False 10853 1726776681.15197: Set connection var ansible_timeout to 10 10853 1726776681.15202: Set connection var ansible_module_compression to ZIP_DEFLATED 10853 1726776681.15205: Set connection var ansible_shell_type to sh 10853 1726776681.15208: Set connection var ansible_shell_executable to /bin/sh 10853 1726776681.15221: variable 'ansible_shell_executable' from source: unknown 10853 1726776681.15224: variable 'ansible_connection' from source: unknown 10853 1726776681.15225: variable 'ansible_module_compression' from source: unknown 10853 1726776681.15227: variable 'ansible_shell_type' from source: unknown 10853 1726776681.15231: variable 'ansible_shell_executable' from source: unknown 10853 1726776681.15234: variable 'ansible_host' from source: host vars for 'managed_node2' 10853 1726776681.15238: variable 'ansible_pipelining' from source: unknown 10853 1726776681.15241: variable 'ansible_timeout' from source: unknown 10853 1726776681.15245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10853 1726776681.15415: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10853 1726776681.15428: variable 'omit' from source: magic vars 10853 1726776681.15436: starting attempt loop 10853 1726776681.15439: running the handler 10853 1726776681.15452: _low_level_execute_command(): starting 10853 1726776681.15459: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10853 1726776681.17859: stdout chunk (state=2): >>>/root <<< 10853 1726776681.17985: 
stderr chunk (state=3): >>><<< 10853 1726776681.17992: stdout chunk (state=3): >>><<< 10853 1726776681.18010: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10853 1726776681.18023: _low_level_execute_command(): starting 10853 1726776681.18030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081 `" && echo ansible-tmp-1726776681.180181-10853-70596341830081="` echo /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081 `" ) && sleep 0' 10853 1726776681.20842: stdout chunk (state=2): >>>ansible-tmp-1726776681.180181-10853-70596341830081=/root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081 <<< 10853 1726776681.21059: stderr chunk (state=3): >>><<< 10853 1726776681.21065: stdout chunk (state=3): >>><<< 10853 1726776681.21078: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776681.180181-10853-70596341830081=/root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081 , stderr= 10853 1726776681.21110: variable 'ansible_module_compression' from source: unknown 10853 1726776681.21160: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10853 1726776681.21194: variable 'ansible_facts' from source: unknown 10853 1726776681.21286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/AnsiballZ_kernel_settings_get_config.py 10853 1726776681.21665: Sending initial data 10853 1726776681.21676: Sent initial data (172 bytes) 10853 1726776681.24135: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpu2_bc4m8 /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/AnsiballZ_kernel_settings_get_config.py <<< 10853 1726776681.25168: stderr 
chunk (state=3): >>><<< 10853 1726776681.25174: stdout chunk (state=3): >>><<< 10853 1726776681.25190: done transferring module to remote 10853 1726776681.25200: _low_level_execute_command(): starting 10853 1726776681.25205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/ /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10853 1726776681.27513: stderr chunk (state=2): >>><<< 10853 1726776681.27526: stdout chunk (state=2): >>><<< 10853 1726776681.27543: _low_level_execute_command() done: rc=0, stdout=, stderr= 10853 1726776681.27547: _low_level_execute_command(): starting 10853 1726776681.27550: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10853 1726776681.43278: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10853 1726776681.44323: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10853 1726776681.44372: stderr chunk (state=3): >>><<< 10853 1726776681.44379: stdout chunk (state=3): >>><<< 10853 1726776681.44397: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 10853 1726776681.44426: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10853 1726776681.44439: _low_level_execute_command(): starting 10853 1726776681.44445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776681.180181-10853-70596341830081/ > /dev/null 2>&1 && sleep 0' 10853 1726776681.46869: stderr chunk (state=2): >>><<< 10853 1726776681.46876: stdout chunk (state=2): >>><<< 10853 1726776681.46891: _low_level_execute_command() done: rc=0, stdout=, stderr= 10853 1726776681.46903: handler run complete 10853 1726776681.46920: attempt loop complete, returning result 10853 1726776681.46924: _execute() done 10853 1726776681.46927: dumping result 
to json 10853 1726776681.46933: done dumping result, returning 10853 1726776681.46941: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-cec2-986e-000000000615] 10853 1726776681.46947: sending task result for task 120fa90a-8a95-cec2-986e-000000000615 10853 1726776681.46984: done sending task result for task 120fa90a-8a95-cec2-986e-000000000615 10853 1726776681.46988: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8218 1726776681.47136: no more pending results, returning what we have 8218 1726776681.47140: results queue empty 8218 1726776681.47141: checking for any_errors_fatal 8218 1726776681.47148: done checking for any_errors_fatal 8218 1726776681.47149: checking for max_fail_percentage 8218 1726776681.47150: done checking for max_fail_percentage 8218 1726776681.47151: checking to see if all hosts have failed and the running result is not ok 8218 1726776681.47152: done checking to see if all hosts have failed 8218 1726776681.47152: getting the remaining hosts for this loop 8218 1726776681.47153: done getting the remaining hosts for this loop 8218 1726776681.47156: getting the next task for host managed_node2 8218 1726776681.47161: done getting next task for host managed_node2 8218 1726776681.47163: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8218 1726776681.47166: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776681.47175: getting variables 8218 1726776681.47176: in VariableManager get_vars() 8218 1726776681.47210: Calling all_inventory to load vars for managed_node2 8218 1726776681.47212: Calling groups_inventory to load vars for managed_node2 8218 1726776681.47214: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776681.47222: Calling all_plugins_play to load vars for managed_node2 8218 1726776681.47225: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776681.47227: Calling groups_plugins_play to load vars for managed_node2 8218 1726776681.47384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776681.47500: done with get_vars() 8218 1726776681.47508: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:11:21 -0400 (0:00:00.338) 0:01:07.306 **** 8218 1726776681.47581: entering _queue_task() for managed_node2/stat 8218 1726776681.47740: worker is 1 (out of 1 available) 8218 1726776681.47755: exiting _queue_task() for managed_node2/stat 8218 1726776681.47765: done queuing things up, now waiting for results queue to drain 8218 1726776681.47767: waiting for pending results... 
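The "Find tuned profile parent directory" task queued above loops candidate paths through the `stat` module, skipping empty items (the `item | length > 0` conditional seen below evaluating to False) and checking which candidates exist. A rough local equivalent of that selection logic, under the assumption that the loop effectively wants the first existing directory; `find_profile_parent` is a hypothetical name, not part of the role:

```python
import os
import tempfile

def find_profile_parent(candidates):
    """Return the first candidate path that is an existing directory.

    Hypothetical sketch of the stat loop in this log: empty items are
    skipped (the `item | length > 0` conditional), and the first
    candidate that exists as a directory wins.
    """
    for item in candidates:
        if not item:              # conditional False -> task skipped
            continue
        if os.path.isdir(item):   # stat result: "exists": true, "isdir": true
            return item
    return None

# demo: an empty item, a missing path, then a directory that does exist
existing = tempfile.mkdtemp()
parent = find_profile_parent(["", os.path.join(existing, "missing"), existing])
```

In the log the first non-empty item, `/etc/tuned/profiles`, stats as `"exists": false`, so the loop continues to `/etc/tuned`, which exists as a directory; the sketch models exactly that fall-through.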
10865 1726776681.47901: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10865 1726776681.48015: in run() - task 120fa90a-8a95-cec2-986e-000000000616 10865 1726776681.48034: variable 'ansible_search_path' from source: unknown 10865 1726776681.48038: variable 'ansible_search_path' from source: unknown 10865 1726776681.48077: variable '__prof_from_conf' from source: task vars 10865 1726776681.48317: variable '__prof_from_conf' from source: task vars 10865 1726776681.48453: variable '__data' from source: task vars 10865 1726776681.48509: variable '__kernel_settings_register_tuned_main' from source: set_fact 10865 1726776681.48652: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10865 1726776681.48665: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10865 1726776681.48707: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10865 1726776681.48725: variable 'omit' from source: magic vars 10865 1726776681.48811: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.48822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.48832: variable 'omit' from source: magic vars 10865 1726776681.49004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10865 1726776681.50530: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10865 1726776681.50577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10865 1726776681.50606: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10865 1726776681.50635: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10865 1726776681.50655: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10865 1726776681.50711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10865 1726776681.50734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10865 1726776681.50753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10865 1726776681.50781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10865 1726776681.50794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10865 1726776681.50860: variable 'item' from source: unknown 10865 1726776681.50872: Evaluated conditional (item | length > 0): False 10865 1726776681.50877: when evaluation is False, skipping this task 10865 1726776681.50899: variable 'item' from source: unknown 10865 1726776681.50949: variable 'item' from source: unknown skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 10865 1726776681.51022: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.51034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.51044: variable 'omit' from source: magic vars 10865 
1726776681.51174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10865 1726776681.51193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10865 1726776681.51210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10865 1726776681.51239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10865 1726776681.51251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10865 1726776681.51303: variable 'item' from source: unknown 10865 1726776681.51312: Evaluated conditional (item | length > 0): True 10865 1726776681.51319: variable 'omit' from source: magic vars 10865 1726776681.51349: variable 'omit' from source: magic vars 10865 1726776681.51379: variable 'item' from source: unknown 10865 1726776681.51423: variable 'item' from source: unknown 10865 1726776681.51439: variable 'omit' from source: magic vars 10865 1726776681.51460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10865 1726776681.51478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10865 1726776681.51490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10865 1726776681.51501: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10865 1726776681.51509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10865 1726776681.51528: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10865 1726776681.51544: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.51548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.51611: Set connection var ansible_connection to ssh 10865 1726776681.51619: Set connection var ansible_pipelining to False 10865 1726776681.51626: Set connection var ansible_timeout to 10 10865 1726776681.51634: Set connection var ansible_module_compression to ZIP_DEFLATED 10865 1726776681.51639: Set connection var ansible_shell_type to sh 10865 1726776681.51645: Set connection var ansible_shell_executable to /bin/sh 10865 1726776681.51660: variable 'ansible_shell_executable' from source: unknown 10865 1726776681.51665: variable 'ansible_connection' from source: unknown 10865 1726776681.51668: variable 'ansible_module_compression' from source: unknown 10865 1726776681.51671: variable 'ansible_shell_type' from source: unknown 10865 1726776681.51674: variable 'ansible_shell_executable' from source: unknown 10865 1726776681.51676: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.51678: variable 'ansible_pipelining' from source: unknown 10865 1726776681.51680: variable 'ansible_timeout' from source: unknown 10865 1726776681.51682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.51777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10865 1726776681.51787: variable 'omit' from source: magic vars 10865 1726776681.51791: starting attempt loop 10865 1726776681.51793: running the handler 10865 1726776681.51801: _low_level_execute_command(): starting 10865 1726776681.51806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10865 1726776681.54172: stdout chunk (state=2): >>>/root <<< 10865 1726776681.54316: stderr chunk (state=3): >>><<< 10865 1726776681.54323: stdout chunk (state=3): >>><<< 10865 1726776681.54342: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10865 1726776681.54355: _low_level_execute_command(): starting 10865 1726776681.54362: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043 `" && echo ansible-tmp-1726776681.5434968-10865-39926181407043="` echo /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043 `" ) && sleep 0' 10865 1726776681.56918: stdout chunk (state=2): >>>ansible-tmp-1726776681.5434968-10865-39926181407043=/root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043 <<< 10865 1726776681.57050: stderr chunk (state=3): >>><<< 10865 1726776681.57059: stdout chunk (state=3): >>><<< 10865 1726776681.57076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776681.5434968-10865-39926181407043=/root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043 , stderr= 10865 1726776681.57110: variable 'ansible_module_compression' from source: unknown 10865 1726776681.57155: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10865 1726776681.57185: variable 'ansible_facts' from source: unknown 10865 1726776681.57251: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/AnsiballZ_stat.py 10865 1726776681.57348: Sending initial data 10865 1726776681.57355: Sent initial data (151 bytes) 10865 1726776681.59875: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpra935ljs /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/AnsiballZ_stat.py <<< 10865 1726776681.60945: stderr chunk (state=3): >>><<< 10865 1726776681.60953: stdout chunk (state=3): >>><<< 10865 1726776681.60973: done transferring module to remote 10865 1726776681.60984: _low_level_execute_command(): starting 10865 1726776681.60989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/ /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/AnsiballZ_stat.py && sleep 0' 10865 1726776681.63340: stderr chunk (state=2): >>><<< 10865 1726776681.63348: stdout chunk (state=2): >>><<< 10865 1726776681.63362: _low_level_execute_command() done: rc=0, stdout=, stderr= 10865 1726776681.63367: _low_level_execute_command(): starting 10865 1726776681.63372: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/AnsiballZ_stat.py && sleep 0' 10865 1726776681.78311: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10865 1726776681.79345: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10865 1726776681.79393: stderr chunk (state=3): >>><<< 10865 1726776681.79401: stdout chunk (state=3): >>><<< 10865 1726776681.79417: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 10865 1726776681.79440: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10865 1726776681.79451: _low_level_execute_command(): starting 10865 1726776681.79457: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776681.5434968-10865-39926181407043/ > /dev/null 2>&1 && sleep 0' 10865 1726776681.81881: stderr chunk (state=2): >>><<< 10865 1726776681.81891: stdout chunk (state=2): >>><<< 10865 1726776681.81905: _low_level_execute_command() done: rc=0, stdout=, stderr= 10865 1726776681.81912: handler run complete 10865 1726776681.81927: attempt loop complete, returning result 10865 1726776681.81945: variable 'item' from source: unknown 10865 1726776681.82006: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10865 
1726776681.82093: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.82103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.82113: variable 'omit' from source: magic vars 10865 1726776681.82221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10865 1726776681.82245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10865 1726776681.82265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10865 1726776681.82294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10865 1726776681.82313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10865 1726776681.82373: variable 'item' from source: unknown 10865 1726776681.82382: Evaluated conditional (item | length > 0): True 10865 1726776681.82387: variable 'omit' from source: magic vars 10865 1726776681.82399: variable 'omit' from source: magic vars 10865 1726776681.82425: variable 'item' from source: unknown 10865 1726776681.82474: variable 'item' from source: unknown 10865 1726776681.82488: variable 'omit' from source: magic vars 10865 1726776681.82504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10865 
1726776681.82512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10865 1726776681.82519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10865 1726776681.82532: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10865 1726776681.82536: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.82540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.82590: Set connection var ansible_connection to ssh 10865 1726776681.82597: Set connection var ansible_pipelining to False 10865 1726776681.82603: Set connection var ansible_timeout to 10 10865 1726776681.82610: Set connection var ansible_module_compression to ZIP_DEFLATED 10865 1726776681.82616: Set connection var ansible_shell_type to sh 10865 1726776681.82621: Set connection var ansible_shell_executable to /bin/sh 10865 1726776681.82636: variable 'ansible_shell_executable' from source: unknown 10865 1726776681.82640: variable 'ansible_connection' from source: unknown 10865 1726776681.82643: variable 'ansible_module_compression' from source: unknown 10865 1726776681.82646: variable 'ansible_shell_type' from source: unknown 10865 1726776681.82650: variable 'ansible_shell_executable' from source: unknown 10865 1726776681.82653: variable 'ansible_host' from source: host vars for 'managed_node2' 10865 1726776681.82657: variable 'ansible_pipelining' from source: unknown 10865 1726776681.82660: variable 'ansible_timeout' from source: unknown 10865 1726776681.82664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10865 1726776681.82726: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10865 1726776681.82737: variable 'omit' from source: magic vars 10865 1726776681.82743: starting attempt loop 10865 1726776681.82746: running the handler 10865 1726776681.82753: _low_level_execute_command(): starting 10865 1726776681.82757: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10865 1726776681.84974: stdout chunk (state=2): >>>/root <<< 10865 1726776681.85144: stderr chunk (state=3): >>><<< 10865 1726776681.85149: stdout chunk (state=3): >>><<< 10865 1726776681.85163: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10865 1726776681.85172: _low_level_execute_command(): starting 10865 1726776681.85176: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147 `" && echo ansible-tmp-1726776681.8516889-10865-129253726099147="` echo /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147 `" ) && sleep 0' 10865 1726776681.87769: stdout chunk (state=2): >>>ansible-tmp-1726776681.8516889-10865-129253726099147=/root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147 <<< 10865 1726776681.87946: stderr chunk (state=3): >>><<< 10865 1726776681.87952: stdout chunk (state=3): >>><<< 10865 1726776681.87965: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776681.8516889-10865-129253726099147=/root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147 , stderr= 10865 1726776681.87990: variable 'ansible_module_compression' from source: unknown 10865 1726776681.88024: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10865 1726776681.88041: variable 'ansible_facts' from source: unknown 
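The `_low_level_execute_command()` calls above create the remote working directory inside a `( umask 77 && mkdir -p ... )` subshell, so the directory comes out mode 0700 (owner-only) from the moment it exists. A minimal Python sketch of the same permission trick — the path below is illustrative, not the one from this run:

```python
import os
import stat
import tempfile

# Mirror the `( umask 77 && mkdir -p ... )` subshell from the log:
# a umask of 0o77 masks group/other bits, so the new directory is
# created mode 0700. The path here is a stand-in, not the actual
# /root/.ansible/tmp/ansible-tmp-... directory from this run.
base = os.path.join(tempfile.gettempdir(), "ansible-demo-%d" % os.getpid())
old_umask = os.umask(0o77)
try:
    os.makedirs(base, exist_ok=True)
finally:
    os.umask(old_umask)  # always restore the process-wide umask

mode = stat.S_IMODE(os.stat(base).st_mode)
```

Setting the umask before creation (rather than `chmod` afterwards) means there is no window in which the temp directory is readable by other users.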
10865 1726776681.88095: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/AnsiballZ_stat.py 10865 1726776681.88178: Sending initial data 10865 1726776681.88185: Sent initial data (152 bytes) 10865 1726776681.90642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpgs9e9np1 /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/AnsiballZ_stat.py <<< 10865 1726776681.91689: stderr chunk (state=3): >>><<< 10865 1726776681.91696: stdout chunk (state=3): >>><<< 10865 1726776681.91712: done transferring module to remote 10865 1726776681.91720: _low_level_execute_command(): starting 10865 1726776681.91725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/ /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/AnsiballZ_stat.py && sleep 0' 10865 1726776681.94002: stderr chunk (state=2): >>><<< 10865 1726776681.94009: stdout chunk (state=2): >>><<< 10865 1726776681.94022: _low_level_execute_command() done: rc=0, stdout=, stderr= 10865 1726776681.94026: _low_level_execute_command(): starting 10865 1726776681.94032: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/AnsiballZ_stat.py && sleep 0' 10865 1726776682.10301: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, 
"block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10865 1726776682.11491: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10865 1726776682.11541: stderr chunk (state=3): >>><<< 10865 1726776682.11548: stdout chunk (state=3): >>><<< 10865 1726776682.11564: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
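The JSON payload above is the `stat` module's result for `/etc/tuned`, produced by the `AnsiballZ_stat.py` wrapper that was just transferred and executed. As a rough illustration only (not the module's actual implementation), a few of the same fields can be derived from a plain `os.lstat` call:

```python
import os
import stat as st

def mini_stat(path):
    """Collect a small subset of the fields the stat module reports.

    Illustrative sketch: the real ansible.builtin.stat module also
    computes checksums, mimetypes, attributes, etc.
    """
    info = os.lstat(path)
    mode = info.st_mode
    return {
        "exists": True,
        "path": path,
        "mode": "%04o" % st.S_IMODE(mode),  # e.g. "0755" as in the log
        "isdir": st.S_ISDIR(mode),
        "isreg": st.S_ISREG(mode),
        "islnk": st.S_ISLNK(mode),
        "uid": info.st_uid,
        "gid": info.st_gid,
        "size": info.st_size,
        "inode": info.st_ino,
        "nlink": info.st_nlink,
    }

# "/" is used here only because it exists everywhere; the task above
# actually stats /etc/tuned on the managed node.
result = mini_stat("/")
```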
10865 1726776682.11597: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10865 1726776682.11606: _low_level_execute_command(): starting 10865 1726776682.11611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776681.8516889-10865-129253726099147/ > /dev/null 2>&1 && sleep 0' 10865 1726776682.14037: stderr chunk (state=2): >>><<< 10865 1726776682.14045: stdout chunk (state=2): >>><<< 10865 1726776682.14061: _low_level_execute_command() done: rc=0, stdout=, stderr= 10865 1726776682.14067: handler run complete 10865 1726776682.14098: attempt loop complete, returning result 10865 1726776682.14116: variable 'item' from source: unknown 10865 1726776682.14179: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776634.1489303, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726776632.1399238, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726776632.1399238, "nlink": 4, "path": "/etc/tuned", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10865 1726776682.14222: dumping result to json 10865 1726776682.14234: done dumping result, returning 10865 1726776682.14243: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-cec2-986e-000000000616] 10865 1726776682.14249: sending task result for task 120fa90a-8a95-cec2-986e-000000000616 10865 1726776682.14291: done sending task result for task 120fa90a-8a95-cec2-986e-000000000616 10865 1726776682.14295: WORKER PROCESS EXITING 8218 1726776682.14472: no more pending results, returning what we have 8218 1726776682.14476: results queue empty 8218 1726776682.14476: checking for any_errors_fatal 8218 1726776682.14481: done checking for any_errors_fatal 8218 1726776682.14482: checking for max_fail_percentage 8218 1726776682.14483: done checking for max_fail_percentage 8218 1726776682.14484: checking to see if all hosts have failed and the running result is not ok 8218 1726776682.14485: done checking to see if all hosts have failed 8218 1726776682.14485: getting the remaining hosts for this loop 8218 1726776682.14486: done getting the remaining hosts for this loop 8218 1726776682.14489: getting the next task for host managed_node2 8218 1726776682.14494: done getting next task for host managed_node2 8218 1726776682.14497: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8218 1726776682.14501: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776682.14510: getting variables 8218 1726776682.14511: in VariableManager get_vars() 8218 1726776682.14545: Calling all_inventory to load vars for managed_node2 8218 1726776682.14548: Calling groups_inventory to load vars for managed_node2 8218 1726776682.14549: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776682.14557: Calling all_plugins_play to load vars for managed_node2 8218 1726776682.14560: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776682.14562: Calling groups_plugins_play to load vars for managed_node2 8218 1726776682.14673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776682.14791: done with get_vars() 8218 1726776682.14801: done getting variables 8218 1726776682.14847: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:11:22 -0400 (0:00:00.672) 0:01:07.979 **** 8218 1726776682.14870: entering _queue_task() for managed_node2/set_fact 8218 1726776682.15065: worker is 1 (out of 1 available) 8218 1726776682.15080: exiting _queue_task() for managed_node2/set_fact 
8218 1726776682.15092: done queuing things up, now waiting for results queue to drain 8218 1726776682.15093: waiting for pending results... 10883 1726776682.15228: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10883 1726776682.15359: in run() - task 120fa90a-8a95-cec2-986e-000000000617 10883 1726776682.15376: variable 'ansible_search_path' from source: unknown 10883 1726776682.15380: variable 'ansible_search_path' from source: unknown 10883 1726776682.15408: calling self._execute() 10883 1726776682.15477: variable 'ansible_host' from source: host vars for 'managed_node2' 10883 1726776682.15486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10883 1726776682.15494: variable 'omit' from source: magic vars 10883 1726776682.15571: variable 'omit' from source: magic vars 10883 1726776682.15603: variable 'omit' from source: magic vars 10883 1726776682.15922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10883 1726776682.17432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10883 1726776682.17486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10883 1726776682.17515: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10883 1726776682.17543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10883 1726776682.17566: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10883 1726776682.17620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10883 1726776682.17642: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10883 1726776682.17663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10883 1726776682.17689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10883 1726776682.17701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10883 1726776682.17736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10883 1726776682.17753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10883 1726776682.17772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10883 1726776682.17796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10883 1726776682.17808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
10883 1726776682.17850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10883 1726776682.17869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10883 1726776682.17886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10883 1726776682.17911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10883 1726776682.17921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10883 1726776682.18067: variable '__kernel_settings_find_profile_dirs' from source: set_fact 10883 1726776682.18134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10883 1726776682.18239: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10883 1726776682.18268: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10883 1726776682.18288: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10883 1726776682.18306: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10883 1726776682.18345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10883 1726776682.18364: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10883 1726776682.18379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10883 1726776682.18393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10883 1726776682.18425: variable 'omit' from source: magic vars 10883 1726776682.18445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10883 1726776682.18466: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10883 1726776682.18480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10883 1726776682.18491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10883 1726776682.18498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10883 1726776682.18517: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10883 1726776682.18521: variable 'ansible_host' from source: host vars for 'managed_node2' 10883 1726776682.18523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10883 1726776682.18597: Set connection var ansible_connection to ssh 10883 1726776682.18605: Set connection var ansible_pipelining to False 10883 1726776682.18611: Set connection var ansible_timeout to 10 
10883 1726776682.18619: Set connection var ansible_module_compression to ZIP_DEFLATED 10883 1726776682.18624: Set connection var ansible_shell_type to sh 10883 1726776682.18631: Set connection var ansible_shell_executable to /bin/sh 10883 1726776682.18646: variable 'ansible_shell_executable' from source: unknown 10883 1726776682.18650: variable 'ansible_connection' from source: unknown 10883 1726776682.18654: variable 'ansible_module_compression' from source: unknown 10883 1726776682.18660: variable 'ansible_shell_type' from source: unknown 10883 1726776682.18664: variable 'ansible_shell_executable' from source: unknown 10883 1726776682.18667: variable 'ansible_host' from source: host vars for 'managed_node2' 10883 1726776682.18671: variable 'ansible_pipelining' from source: unknown 10883 1726776682.18675: variable 'ansible_timeout' from source: unknown 10883 1726776682.18678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10883 1726776682.18741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10883 1726776682.18751: variable 'omit' from source: magic vars 10883 1726776682.18758: starting attempt loop 10883 1726776682.18762: running the handler 10883 1726776682.18771: handler run complete 10883 1726776682.18779: attempt loop complete, returning result 10883 1726776682.18782: _execute() done 10883 1726776682.18785: dumping result to json 10883 1726776682.18787: done dumping result, returning 10883 1726776682.18791: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-cec2-986e-000000000617] 10883 1726776682.18796: sending task result for task 120fa90a-8a95-cec2-986e-000000000617 10883 
1726776682.18812: done sending task result for task 120fa90a-8a95-cec2-986e-000000000617 10883 1726776682.18814: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8218 1726776682.19044: no more pending results, returning what we have 8218 1726776682.19047: results queue empty 8218 1726776682.19047: checking for any_errors_fatal 8218 1726776682.19055: done checking for any_errors_fatal 8218 1726776682.19055: checking for max_fail_percentage 8218 1726776682.19058: done checking for max_fail_percentage 8218 1726776682.19059: checking to see if all hosts have failed and the running result is not ok 8218 1726776682.19059: done checking to see if all hosts have failed 8218 1726776682.19060: getting the remaining hosts for this loop 8218 1726776682.19060: done getting the remaining hosts for this loop 8218 1726776682.19062: getting the next task for host managed_node2 8218 1726776682.19066: done getting next task for host managed_node2 8218 1726776682.19068: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8218 1726776682.19070: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776682.19076: getting variables 8218 1726776682.19077: in VariableManager get_vars() 8218 1726776682.19101: Calling all_inventory to load vars for managed_node2 8218 1726776682.19103: Calling groups_inventory to load vars for managed_node2 8218 1726776682.19104: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776682.19110: Calling all_plugins_play to load vars for managed_node2 8218 1726776682.19112: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776682.19113: Calling groups_plugins_play to load vars for managed_node2 8218 1726776682.19219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776682.19344: done with get_vars() 8218 1726776682.19352: done getting variables 8218 1726776682.19394: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:11:22 -0400 (0:00:00.045) 0:01:08.024 **** 8218 1726776682.19416: entering _queue_task() for managed_node2/service 8218 1726776682.19572: worker is 1 (out of 1 available) 8218 1726776682.19586: exiting _queue_task() for managed_node2/service 8218 1726776682.19598: done queuing things up, now waiting for results queue to drain 8218 1726776682.19599: waiting for pending results... 
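The service task queued here ends up running the `systemd` module (the `AnsiballZ_systemd.py` transfer below), whose large `status` payload later in this log mirrors the `key=value` output of `systemctl show <unit>`. A hedged sketch of parsing that format into a dict like the one in the log — the sample lines reuse values visible in this run, but the parser itself is illustrative, not the module's code:

```python
# Sample key=value lines in the `systemctl show` style; the values
# match ones that appear in the tuned.service status payload below.
sample = """\
Type=dbus
Restart=no
PIDFile=/run/tuned/tuned.pid
MainPID=659
Result=success
"""

def parse_systemctl_show(text):
    """Split `systemctl show`-style key=value lines into a dict.

    Values may themselves contain '=' (e.g. ExecStart), so split only
    on the first '=' of each line.
    """
    props = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            props[key] = value
    return props

status = parse_systemctl_show(sample)
```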
10884 1726776682.19720: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10884 1726776682.19836: in run() - task 120fa90a-8a95-cec2-986e-000000000618 10884 1726776682.19852: variable 'ansible_search_path' from source: unknown 10884 1726776682.19856: variable 'ansible_search_path' from source: unknown 10884 1726776682.19888: variable '__kernel_settings_services' from source: include_vars 10884 1726776682.20110: variable '__kernel_settings_services' from source: include_vars 10884 1726776682.20166: variable 'omit' from source: magic vars 10884 1726776682.20248: variable 'ansible_host' from source: host vars for 'managed_node2' 10884 1726776682.20259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10884 1726776682.20268: variable 'omit' from source: magic vars 10884 1726776682.20317: variable 'omit' from source: magic vars 10884 1726776682.20345: variable 'omit' from source: magic vars 10884 1726776682.20377: variable 'item' from source: unknown 10884 1726776682.20431: variable 'item' from source: unknown 10884 1726776682.20450: variable 'omit' from source: magic vars 10884 1726776682.20479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10884 1726776682.20504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10884 1726776682.20522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10884 1726776682.20536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10884 1726776682.20547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10884 1726776682.20569: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10884 
1726776682.20574: variable 'ansible_host' from source: host vars for 'managed_node2' 10884 1726776682.20579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10884 1726776682.20644: Set connection var ansible_connection to ssh 10884 1726776682.20652: Set connection var ansible_pipelining to False 10884 1726776682.20659: Set connection var ansible_timeout to 10 10884 1726776682.20666: Set connection var ansible_module_compression to ZIP_DEFLATED 10884 1726776682.20671: Set connection var ansible_shell_type to sh 10884 1726776682.20676: Set connection var ansible_shell_executable to /bin/sh 10884 1726776682.20689: variable 'ansible_shell_executable' from source: unknown 10884 1726776682.20693: variable 'ansible_connection' from source: unknown 10884 1726776682.20696: variable 'ansible_module_compression' from source: unknown 10884 1726776682.20699: variable 'ansible_shell_type' from source: unknown 10884 1726776682.20703: variable 'ansible_shell_executable' from source: unknown 10884 1726776682.20706: variable 'ansible_host' from source: host vars for 'managed_node2' 10884 1726776682.20710: variable 'ansible_pipelining' from source: unknown 10884 1726776682.20713: variable 'ansible_timeout' from source: unknown 10884 1726776682.20717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10884 1726776682.20806: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10884 1726776682.20818: variable 'omit' from source: magic vars 10884 1726776682.20825: starting attempt loop 10884 1726776682.20830: running the handler 10884 1726776682.20915: variable 'ansible_facts' from source: unknown 10884 1726776682.20995: _low_level_execute_command(): starting 10884 1726776682.21004: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10884 1726776682.23392: stdout chunk (state=2): >>>/root <<< 10884 1726776682.23512: stderr chunk (state=3): >>><<< 10884 1726776682.23518: stdout chunk (state=3): >>><<< 10884 1726776682.23538: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10884 1726776682.23550: _low_level_execute_command(): starting 10884 1726776682.23555: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978 `" && echo ansible-tmp-1726776682.2354512-10884-74696835987978="` echo /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978 `" ) && sleep 0' 10884 1726776682.26258: stdout chunk (state=2): >>>ansible-tmp-1726776682.2354512-10884-74696835987978=/root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978 <<< 10884 1726776682.26407: stderr chunk (state=3): >>><<< 10884 1726776682.26413: stdout chunk (state=3): >>><<< 10884 1726776682.26427: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776682.2354512-10884-74696835987978=/root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978 , stderr= 10884 1726776682.26452: variable 'ansible_module_compression' from source: unknown 10884 1726776682.26490: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10884 1726776682.26537: variable 'ansible_facts' from source: unknown 10884 1726776682.26692: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/AnsiballZ_systemd.py 10884 1726776682.26793: Sending initial data 10884 1726776682.26800: Sent initial data (154 bytes) 10884 1726776682.29292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmprf3543ka 
/root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/AnsiballZ_systemd.py <<< 10884 1726776682.31265: stderr chunk (state=3): >>><<< 10884 1726776682.31276: stdout chunk (state=3): >>><<< 10884 1726776682.31297: done transferring module to remote 10884 1726776682.31309: _low_level_execute_command(): starting 10884 1726776682.31314: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/ /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/AnsiballZ_systemd.py && sleep 0' 10884 1726776682.33716: stderr chunk (state=2): >>><<< 10884 1726776682.33723: stdout chunk (state=2): >>><<< 10884 1726776682.33738: _low_level_execute_command() done: rc=0, stdout=, stderr= 10884 1726776682.33742: _low_level_execute_command(): starting 10884 1726776682.33747: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/AnsiballZ_systemd.py && sleep 0' 10884 1726776682.61544: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "22814720", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 10884 1726776682.61600: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", 
"LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": 
"infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10884 1726776682.63138: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10884 1726776682.63183: stderr chunk (state=3): >>><<< 10884 1726776682.63191: stdout chunk (state=3): >>><<< 10884 1726776682.63210: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "22814720", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 10884 1726776682.63312: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10884 1726776682.63332: _low_level_execute_command(): starting 10884 1726776682.63339: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776682.2354512-10884-74696835987978/ > /dev/null 2>&1 && sleep 0' 10884 1726776682.65721: stderr chunk (state=2): >>><<< 10884 1726776682.65730: stdout chunk (state=2): >>><<< 10884 1726776682.65744: _low_level_execute_command() done: rc=0, stdout=, stderr= 10884 1726776682.65752: handler run complete 10884 1726776682.65785: attempt loop 
complete, returning result 10884 1726776682.65803: variable 'item' from source: unknown 10884 1726776682.65867: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon 
cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "659", "MemoryAccounting": "yes", "MemoryCurrent": "22814720", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": 
"sysinit.target dbus.socket system.slice dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "WatchdogUSec": "0" } } 10884 1726776682.65965: dumping result to json 10884 1726776682.65983: done dumping result, returning 10884 1726776682.65992: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-cec2-986e-000000000618] 10884 1726776682.65999: sending task result for task 
120fa90a-8a95-cec2-986e-000000000618 10884 1726776682.66103: done sending task result for task 120fa90a-8a95-cec2-986e-000000000618 10884 1726776682.66108: WORKER PROCESS EXITING 8218 1726776682.66450: no more pending results, returning what we have 8218 1726776682.66453: results queue empty 8218 1726776682.66453: checking for any_errors_fatal 8218 1726776682.66459: done checking for any_errors_fatal 8218 1726776682.66460: checking for max_fail_percentage 8218 1726776682.66461: done checking for max_fail_percentage 8218 1726776682.66461: checking to see if all hosts have failed and the running result is not ok 8218 1726776682.66462: done checking to see if all hosts have failed 8218 1726776682.66462: getting the remaining hosts for this loop 8218 1726776682.66463: done getting the remaining hosts for this loop 8218 1726776682.66465: getting the next task for host managed_node2 8218 1726776682.66469: done getting next task for host managed_node2 8218 1726776682.66471: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8218 1726776682.66473: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776682.66479: getting variables 8218 1726776682.66480: in VariableManager get_vars() 8218 1726776682.66502: Calling all_inventory to load vars for managed_node2 8218 1726776682.66503: Calling groups_inventory to load vars for managed_node2 8218 1726776682.66505: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776682.66511: Calling all_plugins_play to load vars for managed_node2 8218 1726776682.66512: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776682.66514: Calling groups_plugins_play to load vars for managed_node2 8218 1726776682.66630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776682.66749: done with get_vars() 8218 1726776682.66760: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:11:22 -0400 (0:00:00.474) 0:01:08.498 **** 8218 1726776682.66825: entering _queue_task() for managed_node2/file 8218 1726776682.66984: worker is 1 (out of 1 available) 8218 1726776682.66999: exiting _queue_task() for managed_node2/file 8218 1726776682.67010: done queuing things up, now waiting for results queue to drain 8218 1726776682.67012: waiting for pending results... 
10897 1726776682.67150: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10897 1726776682.67275: in run() - task 120fa90a-8a95-cec2-986e-000000000619 10897 1726776682.67291: variable 'ansible_search_path' from source: unknown 10897 1726776682.67294: variable 'ansible_search_path' from source: unknown 10897 1726776682.67321: calling self._execute() 10897 1726776682.67387: variable 'ansible_host' from source: host vars for 'managed_node2' 10897 1726776682.67396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10897 1726776682.67405: variable 'omit' from source: magic vars 10897 1726776682.67492: variable 'omit' from source: magic vars 10897 1726776682.67538: variable 'omit' from source: magic vars 10897 1726776682.67563: variable '__kernel_settings_profile_dir' from source: role '' all vars 10897 1726776682.67842: variable '__kernel_settings_profile_dir' from source: role '' all vars 10897 1726776682.67921: variable '__kernel_settings_profile_parent' from source: set_fact 10897 1726776682.67930: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10897 1726776682.67972: variable 'omit' from source: magic vars 10897 1726776682.68018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10897 1726776682.68052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10897 1726776682.68072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10897 1726776682.68090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10897 1726776682.68103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10897 1726776682.68131: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 10897 1726776682.68138: variable 'ansible_host' from source: host vars for 'managed_node2' 10897 1726776682.68142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10897 1726776682.68232: Set connection var ansible_connection to ssh 10897 1726776682.68241: Set connection var ansible_pipelining to False 10897 1726776682.68246: Set connection var ansible_timeout to 10 10897 1726776682.68254: Set connection var ansible_module_compression to ZIP_DEFLATED 10897 1726776682.68260: Set connection var ansible_shell_type to sh 10897 1726776682.68265: Set connection var ansible_shell_executable to /bin/sh 10897 1726776682.68283: variable 'ansible_shell_executable' from source: unknown 10897 1726776682.68288: variable 'ansible_connection' from source: unknown 10897 1726776682.68291: variable 'ansible_module_compression' from source: unknown 10897 1726776682.68295: variable 'ansible_shell_type' from source: unknown 10897 1726776682.68297: variable 'ansible_shell_executable' from source: unknown 10897 1726776682.68300: variable 'ansible_host' from source: host vars for 'managed_node2' 10897 1726776682.68303: variable 'ansible_pipelining' from source: unknown 10897 1726776682.68306: variable 'ansible_timeout' from source: unknown 10897 1726776682.68310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10897 1726776682.68448: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10897 1726776682.68458: variable 'omit' from source: magic vars 10897 1726776682.68463: starting attempt loop 10897 1726776682.68465: running the handler 10897 1726776682.68474: _low_level_execute_command(): starting 10897 1726776682.68479: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 10897 1726776682.70898: stdout chunk (state=2): >>>/root <<< 10897 1726776682.71017: stderr chunk (state=3): >>><<< 10897 1726776682.71028: stdout chunk (state=3): >>><<< 10897 1726776682.71045: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10897 1726776682.71062: _low_level_execute_command(): starting 10897 1726776682.71068: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882 `" && echo ansible-tmp-1726776682.710561-10897-218044459558882="` echo /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882 `" ) && sleep 0' 10897 1726776682.74003: stdout chunk (state=2): >>>ansible-tmp-1726776682.710561-10897-218044459558882=/root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882 <<< 10897 1726776682.74130: stderr chunk (state=3): >>><<< 10897 1726776682.74137: stdout chunk (state=3): >>><<< 10897 1726776682.74151: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776682.710561-10897-218044459558882=/root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882 , stderr= 10897 1726776682.74186: variable 'ansible_module_compression' from source: unknown 10897 1726776682.74231: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10897 1726776682.74261: variable 'ansible_facts' from source: unknown 10897 1726776682.74327: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/AnsiballZ_file.py 10897 1726776682.74421: Sending initial data 10897 1726776682.74428: Sent initial data (151 bytes) 10897 1726776682.76953: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpe867fkrl /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/AnsiballZ_file.py <<< 10897 
1726776682.78027: stderr chunk (state=3): >>><<< 10897 1726776682.78034: stdout chunk (state=3): >>><<< 10897 1726776682.78052: done transferring module to remote 10897 1726776682.78064: _low_level_execute_command(): starting 10897 1726776682.78069: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/ /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/AnsiballZ_file.py && sleep 0' 10897 1726776682.80380: stderr chunk (state=2): >>><<< 10897 1726776682.80388: stdout chunk (state=2): >>><<< 10897 1726776682.80400: _low_level_execute_command() done: rc=0, stdout=, stderr= 10897 1726776682.80404: _low_level_execute_command(): starting 10897 1726776682.80409: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/AnsiballZ_file.py && sleep 0' 10897 1726776682.96700: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10897 1726776682.97823: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10897 1726776682.97876: stderr chunk (state=3): >>><<< 10897 1726776682.97886: stdout chunk (state=3): >>><<< 10897 1726776682.97902: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
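The file-module run above hands its result back as a single JSON object on stdout, which the action plugin parses into the task result. A minimal sketch of that controller-side step, using a trimmed-down copy of the JSON shown in this log (the real result carries many more keys):

```python
import json

# Trimmed copy of the result JSON emitted by AnsiballZ_file.py in the log above
raw = ('{"path": "/etc/tuned/kernel_settings", "changed": false, '
      '"state": "directory", "mode": "0755"}')

def parse_module_result(stdout: str) -> dict:
    """Parse a module's stdout (rc=0) into the task result dict."""
    return json.loads(stdout)

result = parse_module_result(raw)
# "changed": false means the directory already existed with the requested
# mode, so the task is reported as "ok" rather than "changed"
assert result["changed"] is False
print(result["state"], result["mode"])  # → directory 0755
```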
10897 1726776682.97935: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10897 1726776682.97946: _low_level_execute_command(): starting 10897 1726776682.97952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776682.710561-10897-218044459558882/ > /dev/null 2>&1 && sleep 0' 10897 1726776683.00380: stderr chunk (state=2): >>><<< 10897 1726776683.00387: stdout chunk (state=2): >>><<< 10897 1726776683.00400: _low_level_execute_command() done: rc=0, stdout=, stderr= 10897 1726776683.00406: handler run complete 10897 1726776683.00424: attempt loop complete, returning result 10897 1726776683.00428: _execute() done 10897 1726776683.00433: dumping result to json 10897 1726776683.00436: done dumping result, returning 10897 1726776683.00453: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-cec2-986e-000000000619] 10897 1726776683.00462: sending task result for task 120fa90a-8a95-cec2-986e-000000000619 10897 1726776683.00498: done sending task result for task 120fa90a-8a95-cec2-986e-000000000619 10897 1726776683.00500: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": 
"/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8218 1726776683.00699: no more pending results, returning what we have 8218 1726776683.00703: results queue empty 8218 1726776683.00703: checking for any_errors_fatal 8218 1726776683.00716: done checking for any_errors_fatal 8218 1726776683.00717: checking for max_fail_percentage 8218 1726776683.00718: done checking for max_fail_percentage 8218 1726776683.00719: checking to see if all hosts have failed and the running result is not ok 8218 1726776683.00720: done checking to see if all hosts have failed 8218 1726776683.00720: getting the remaining hosts for this loop 8218 1726776683.00721: done getting the remaining hosts for this loop 8218 1726776683.00724: getting the next task for host managed_node2 8218 1726776683.00732: done getting next task for host managed_node2 8218 1726776683.00735: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8218 1726776683.00737: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8218 1726776683.00744: getting variables
8218 1726776683.00745: in VariableManager get_vars()
8218 1726776683.00773: Calling all_inventory to load vars for managed_node2
8218 1726776683.00775: Calling groups_inventory to load vars for managed_node2
8218 1726776683.00776: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776683.00782: Calling all_plugins_play to load vars for managed_node2
8218 1726776683.00784: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776683.00786: Calling groups_plugins_play to load vars for managed_node2
8218 1726776683.00891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776683.01011: done with get_vars()
8218 1726776683.01021: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] **********
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80
Thursday 19 September 2024 16:11:23 -0400 (0:00:00.342) 0:01:08.841 ****
8218 1726776683.01094: entering _queue_task() for managed_node2/slurp
8218 1726776683.01254: worker is 1 (out of 1 available)
8218 1726776683.01269: exiting _queue_task() for managed_node2/slurp
8218 1726776683.01280: done queuing things up, now waiting for results queue to drain
8218 1726776683.01282: waiting for pending results...
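The Get active_profile task queued here runs the slurp module, which (as its result further down in the log shows) reads the target file and returns the bytes base64-encoded. A minimal re-implementation sketch of that behaviour, using a temporary file in place of /etc/tuned/active_profile:

```python
import base64
import json
import tempfile

def slurp(path: str) -> dict:
    """Read a file and return its contents base64-encoded, in the same
    result shape as the slurp output seen later in this log."""
    with open(path, "rb") as f:
        data = f.read()
    return {
        "content": base64.b64encode(data).decode("ascii"),
        "source": path,
        "encoding": "base64",
    }

# A temporary file stands in for /etc/tuned/active_profile here
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("virtual-guest kernel_settings\n")
    path = f.name

result = slurp(path)
print(json.dumps(result))
```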
10912 1726776683.01414: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10912 1726776683.01525: in run() - task 120fa90a-8a95-cec2-986e-00000000061a 10912 1726776683.01543: variable 'ansible_search_path' from source: unknown 10912 1726776683.01547: variable 'ansible_search_path' from source: unknown 10912 1726776683.01577: calling self._execute() 10912 1726776683.01650: variable 'ansible_host' from source: host vars for 'managed_node2' 10912 1726776683.01664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10912 1726776683.01673: variable 'omit' from source: magic vars 10912 1726776683.01770: variable 'omit' from source: magic vars 10912 1726776683.01817: variable 'omit' from source: magic vars 10912 1726776683.01845: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10912 1726776683.02133: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10912 1726776683.02213: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10912 1726776683.02299: variable 'omit' from source: magic vars 10912 1726776683.02339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10912 1726776683.02375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10912 1726776683.02395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10912 1726776683.02412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10912 1726776683.02425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10912 1726776683.02456: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10912 1726776683.02464: variable 'ansible_host' from 
source: host vars for 'managed_node2' 10912 1726776683.02470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10912 1726776683.02544: Set connection var ansible_connection to ssh 10912 1726776683.02553: Set connection var ansible_pipelining to False 10912 1726776683.02561: Set connection var ansible_timeout to 10 10912 1726776683.02566: Set connection var ansible_module_compression to ZIP_DEFLATED 10912 1726776683.02569: Set connection var ansible_shell_type to sh 10912 1726776683.02572: Set connection var ansible_shell_executable to /bin/sh 10912 1726776683.02586: variable 'ansible_shell_executable' from source: unknown 10912 1726776683.02588: variable 'ansible_connection' from source: unknown 10912 1726776683.02592: variable 'ansible_module_compression' from source: unknown 10912 1726776683.02595: variable 'ansible_shell_type' from source: unknown 10912 1726776683.02597: variable 'ansible_shell_executable' from source: unknown 10912 1726776683.02599: variable 'ansible_host' from source: host vars for 'managed_node2' 10912 1726776683.02601: variable 'ansible_pipelining' from source: unknown 10912 1726776683.02603: variable 'ansible_timeout' from source: unknown 10912 1726776683.02605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10912 1726776683.02741: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10912 1726776683.02753: variable 'omit' from source: magic vars 10912 1726776683.02762: starting attempt loop 10912 1726776683.02766: running the handler 10912 1726776683.02777: _low_level_execute_command(): starting 10912 1726776683.02784: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10912 1726776683.05417: stdout chunk (state=2): >>>/root <<< 10912 1726776683.05426: 
stderr chunk (state=2): >>><<< 10912 1726776683.05440: stdout chunk (state=3): >>><<< 10912 1726776683.05454: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10912 1726776683.05472: _low_level_execute_command(): starting 10912 1726776683.05478: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880 `" && echo ansible-tmp-1726776683.054657-10912-232108966596880="` echo /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880 `" ) && sleep 0' 10912 1726776683.08804: stdout chunk (state=2): >>>ansible-tmp-1726776683.054657-10912-232108966596880=/root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880 <<< 10912 1726776683.09298: stderr chunk (state=3): >>><<< 10912 1726776683.09307: stdout chunk (state=3): >>><<< 10912 1726776683.09325: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776683.054657-10912-232108966596880=/root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880 , stderr= 10912 1726776683.09373: variable 'ansible_module_compression' from source: unknown 10912 1726776683.09421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10912 1726776683.09458: variable 'ansible_facts' from source: unknown 10912 1726776683.09555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/AnsiballZ_slurp.py 10912 1726776683.10111: Sending initial data 10912 1726776683.10118: Sent initial data (152 bytes) 10912 1726776683.12681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmps8oa_lct /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/AnsiballZ_slurp.py <<< 10912 1726776683.14025: stderr chunk (state=3): >>><<< 10912 1726776683.14037: stdout chunk (state=3): >>><<< 10912 
1726776683.14060: done transferring module to remote 10912 1726776683.14073: _low_level_execute_command(): starting 10912 1726776683.14081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/ /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/AnsiballZ_slurp.py && sleep 0' 10912 1726776683.16676: stderr chunk (state=2): >>><<< 10912 1726776683.16687: stdout chunk (state=2): >>><<< 10912 1726776683.16702: _low_level_execute_command() done: rc=0, stdout=, stderr= 10912 1726776683.16706: _low_level_execute_command(): starting 10912 1726776683.16711: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/AnsiballZ_slurp.py && sleep 0' 10912 1726776683.31638: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10912 1726776683.32686: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10912 1726776683.32733: stderr chunk (state=3): >>><<< 10912 1726776683.32740: stdout chunk (state=3): >>><<< 10912 1726776683.32754: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.12.75 closed. 
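The base64 payload returned by slurp above decodes to the current tuned profile string, which the role inspects in the following tasks. A quick check of the exact value from the log:

```python
import base64

# "content" field from the slurp result above
content = "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK"
profile = base64.b64decode(content).decode("utf-8")
print(repr(profile))  # → 'virtual-guest kernel_settings\n'
```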
10912 1726776683.32778: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10912 1726776683.32790: _low_level_execute_command(): starting 10912 1726776683.32796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776683.054657-10912-232108966596880/ > /dev/null 2>&1 && sleep 0' 10912 1726776683.35213: stderr chunk (state=2): >>><<< 10912 1726776683.35222: stdout chunk (state=2): >>><<< 10912 1726776683.35237: _low_level_execute_command() done: rc=0, stdout=, stderr= 10912 1726776683.35244: handler run complete 10912 1726776683.35257: attempt loop complete, returning result 10912 1726776683.35261: _execute() done 10912 1726776683.35264: dumping result to json 10912 1726776683.35269: done dumping result, returning 10912 1726776683.35276: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-cec2-986e-00000000061a] 10912 1726776683.35283: sending task result for task 120fa90a-8a95-cec2-986e-00000000061a 10912 1726776683.35315: done sending task result for task 120fa90a-8a95-cec2-986e-00000000061a 10912 1726776683.35319: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8218 1726776683.35467: no more pending 
results, returning what we have 8218 1726776683.35471: results queue empty 8218 1726776683.35472: checking for any_errors_fatal 8218 1726776683.35481: done checking for any_errors_fatal 8218 1726776683.35482: checking for max_fail_percentage 8218 1726776683.35483: done checking for max_fail_percentage 8218 1726776683.35484: checking to see if all hosts have failed and the running result is not ok 8218 1726776683.35485: done checking to see if all hosts have failed 8218 1726776683.35485: getting the remaining hosts for this loop 8218 1726776683.35486: done getting the remaining hosts for this loop 8218 1726776683.35489: getting the next task for host managed_node2 8218 1726776683.35495: done getting next task for host managed_node2 8218 1726776683.35498: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8218 1726776683.35500: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8218 1726776683.35511: getting variables
8218 1726776683.35512: in VariableManager get_vars()
8218 1726776683.35547: Calling all_inventory to load vars for managed_node2
8218 1726776683.35549: Calling groups_inventory to load vars for managed_node2
8218 1726776683.35551: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776683.35561: Calling all_plugins_play to load vars for managed_node2
8218 1726776683.35564: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776683.35566: Calling groups_plugins_play to load vars for managed_node2
8218 1726776683.35673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776683.35795: done with get_vars()
8218 1726776683.35803: done getting variables
8218 1726776683.35850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] **********
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85
Thursday 19 September 2024 16:11:23 -0400 (0:00:00.347) 0:01:09.189 ****
8218 1726776683.35876: entering _queue_task() for managed_node2/set_fact
8218 1726776683.36038: worker is 1 (out of 1 available)
8218 1726776683.36051: exiting _queue_task() for managed_node2/set_fact
8218 1726776683.36065: done queuing things up, now waiting for results queue to drain
8218 1726776683.36067: waiting for pending results...
10932 1726776683.36193: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10932 1726776683.36307: in run() - task 120fa90a-8a95-cec2-986e-00000000061b 10932 1726776683.36325: variable 'ansible_search_path' from source: unknown 10932 1726776683.36330: variable 'ansible_search_path' from source: unknown 10932 1726776683.36357: calling self._execute() 10932 1726776683.36501: variable 'ansible_host' from source: host vars for 'managed_node2' 10932 1726776683.36510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10932 1726776683.36518: variable 'omit' from source: magic vars 10932 1726776683.36599: variable 'omit' from source: magic vars 10932 1726776683.36637: variable 'omit' from source: magic vars 10932 1726776683.36909: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10932 1726776683.36919: variable '__cur_profile' from source: task vars 10932 1726776683.37023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10932 1726776683.38499: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10932 1726776683.38742: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10932 1726776683.38772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10932 1726776683.38799: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10932 1726776683.38820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10932 1726776683.38877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10932 
1726776683.38899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10932 1726776683.38917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10932 1726776683.38945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10932 1726776683.38956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10932 1726776683.39033: variable '__kernel_settings_tuned_current_profile' from source: set_fact 10932 1726776683.39072: variable 'omit' from source: magic vars 10932 1726776683.39094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10932 1726776683.39116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10932 1726776683.39133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10932 1726776683.39147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10932 1726776683.39157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10932 1726776683.39179: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10932 1726776683.39185: variable 'ansible_host' from source: host vars for 'managed_node2' 10932 1726776683.39189: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 10932 1726776683.39262: Set connection var ansible_connection to ssh 10932 1726776683.39270: Set connection var ansible_pipelining to False 10932 1726776683.39277: Set connection var ansible_timeout to 10 10932 1726776683.39284: Set connection var ansible_module_compression to ZIP_DEFLATED 10932 1726776683.39289: Set connection var ansible_shell_type to sh 10932 1726776683.39294: Set connection var ansible_shell_executable to /bin/sh 10932 1726776683.39311: variable 'ansible_shell_executable' from source: unknown 10932 1726776683.39315: variable 'ansible_connection' from source: unknown 10932 1726776683.39318: variable 'ansible_module_compression' from source: unknown 10932 1726776683.39321: variable 'ansible_shell_type' from source: unknown 10932 1726776683.39324: variable 'ansible_shell_executable' from source: unknown 10932 1726776683.39328: variable 'ansible_host' from source: host vars for 'managed_node2' 10932 1726776683.39332: variable 'ansible_pipelining' from source: unknown 10932 1726776683.39334: variable 'ansible_timeout' from source: unknown 10932 1726776683.39337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10932 1726776683.39394: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10932 1726776683.39404: variable 'omit' from source: magic vars 10932 1726776683.39408: starting attempt loop 10932 1726776683.39411: running the handler 10932 1726776683.39419: handler run complete 10932 1726776683.39425: attempt loop complete, returning result 10932 1726776683.39427: _execute() done 10932 1726776683.39431: dumping result to json 10932 1726776683.39434: done dumping result, returning 10932 1726776683.39440: done running TaskExecutor() 
for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-cec2-986e-00000000061b] 10932 1726776683.39447: sending task result for task 120fa90a-8a95-cec2-986e-00000000061b 10932 1726776683.39470: done sending task result for task 120fa90a-8a95-cec2-986e-00000000061b 10932 1726776683.39474: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8218 1726776683.39715: no more pending results, returning what we have 8218 1726776683.39718: results queue empty 8218 1726776683.39719: checking for any_errors_fatal 8218 1726776683.39723: done checking for any_errors_fatal 8218 1726776683.39724: checking for max_fail_percentage 8218 1726776683.39725: done checking for max_fail_percentage 8218 1726776683.39725: checking to see if all hosts have failed and the running result is not ok 8218 1726776683.39726: done checking to see if all hosts have failed 8218 1726776683.39727: getting the remaining hosts for this loop 8218 1726776683.39728: done getting the remaining hosts for this loop 8218 1726776683.39732: getting the next task for host managed_node2 8218 1726776683.39737: done getting next task for host managed_node2 8218 1726776683.39739: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8218 1726776683.39740: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
8218 1726776683.39752: getting variables
8218 1726776683.39753: in VariableManager get_vars()
8218 1726776683.39775: Calling all_inventory to load vars for managed_node2
8218 1726776683.39777: Calling groups_inventory to load vars for managed_node2
8218 1726776683.39778: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776683.39784: Calling all_plugins_play to load vars for managed_node2
8218 1726776683.39786: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776683.39787: Calling groups_plugins_play to load vars for managed_node2
8218 1726776683.39889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776683.40009: done with get_vars()
8218 1726776683.40016: done getting variables
8218 1726776683.40062: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
Thursday 19 September 2024 16:11:23 -0400 (0:00:00.042) 0:01:09.231 ****
8218 1726776683.40084: entering _queue_task() for managed_node2/copy
8218 1726776683.40241: worker is 1 (out of 1 available)
8218 1726776683.40255: exiting _queue_task() for managed_node2/copy
8218 1726776683.40268: done queuing things up, now waiting for results queue to drain
8218 1726776683.40271: waiting for pending results...
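The Set active_profile result above ("virtual-guest kernel_settings", unchanged) and the copy task queued here together implement one idea: the kernel_settings tuned profile must appear in /etc/tuned/active_profile, appended to whatever profiles are already active. The role's actual Jinja expression is not visible in this log, but its effect can be sketched as:

```python
def ensure_kernel_settings(active: str, profile: str = "kernel_settings") -> str:
    """Sketch of the role's intent: append the kernel_settings profile to
    the space-separated tuned active_profile list only if it is missing."""
    profiles = active.split()
    if profile not in profiles:
        profiles.append(profile)
    return " ".join(profiles)

# Matches the set_fact result in the log: already present, so unchanged
print(ensure_kernel_settings("virtual-guest kernel_settings"))  # → virtual-guest kernel_settings
# When missing, the profile name is appended
print(ensure_kernel_settings("virtual-guest"))  # → virtual-guest kernel_settings
```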
10933 1726776683.40395: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10933 1726776683.40508: in run() - task 120fa90a-8a95-cec2-986e-00000000061c 10933 1726776683.40524: variable 'ansible_search_path' from source: unknown 10933 1726776683.40530: variable 'ansible_search_path' from source: unknown 10933 1726776683.40557: calling self._execute() 10933 1726776683.40624: variable 'ansible_host' from source: host vars for 'managed_node2' 10933 1726776683.40634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10933 1726776683.40643: variable 'omit' from source: magic vars 10933 1726776683.40716: variable 'omit' from source: magic vars 10933 1726776683.40750: variable 'omit' from source: magic vars 10933 1726776683.40772: variable '__kernel_settings_active_profile' from source: set_fact 10933 1726776683.40976: variable '__kernel_settings_active_profile' from source: set_fact 10933 1726776683.40998: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10933 1726776683.41052: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10933 1726776683.41103: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10933 1726776683.41132: variable 'omit' from source: magic vars 10933 1726776683.41172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10933 1726776683.41196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10933 1726776683.41213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10933 1726776683.41226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10933 1726776683.41238: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10933 1726776683.41264: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10933 1726776683.41269: variable 'ansible_host' from source: host vars for 'managed_node2' 10933 1726776683.41274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10933 1726776683.41337: Set connection var ansible_connection to ssh 10933 1726776683.41345: Set connection var ansible_pipelining to False 10933 1726776683.41351: Set connection var ansible_timeout to 10 10933 1726776683.41361: Set connection var ansible_module_compression to ZIP_DEFLATED 10933 1726776683.41367: Set connection var ansible_shell_type to sh 10933 1726776683.41372: Set connection var ansible_shell_executable to /bin/sh 10933 1726776683.41388: variable 'ansible_shell_executable' from source: unknown 10933 1726776683.41392: variable 'ansible_connection' from source: unknown 10933 1726776683.41395: variable 'ansible_module_compression' from source: unknown 10933 1726776683.41398: variable 'ansible_shell_type' from source: unknown 10933 1726776683.41401: variable 'ansible_shell_executable' from source: unknown 10933 1726776683.41405: variable 'ansible_host' from source: host vars for 'managed_node2' 10933 1726776683.41409: variable 'ansible_pipelining' from source: unknown 10933 1726776683.41412: variable 'ansible_timeout' from source: unknown 10933 1726776683.41417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10933 1726776683.41507: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10933 1726776683.41519: variable 'omit' from source: magic vars 10933 1726776683.41525: starting attempt 
loop 10933 1726776683.41530: running the handler 10933 1726776683.41542: _low_level_execute_command(): starting 10933 1726776683.41550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10933 1726776683.43885: stdout chunk (state=2): >>>/root <<< 10933 1726776683.44008: stderr chunk (state=3): >>><<< 10933 1726776683.44014: stdout chunk (state=3): >>><<< 10933 1726776683.44033: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10933 1726776683.44045: _low_level_execute_command(): starting 10933 1726776683.44051: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332 `" && echo ansible-tmp-1726776683.4404104-10933-169930017704332="` echo /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332 `" ) && sleep 0' 10933 1726776683.46948: stdout chunk (state=2): >>>ansible-tmp-1726776683.4404104-10933-169930017704332=/root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332 <<< 10933 1726776683.47107: stderr chunk (state=3): >>><<< 10933 1726776683.47114: stdout chunk (state=3): >>><<< 10933 1726776683.47133: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776683.4404104-10933-169930017704332=/root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332 , stderr= 10933 1726776683.47216: variable 'ansible_module_compression' from source: unknown 10933 1726776683.47269: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10933 1726776683.47300: variable 'ansible_facts' from source: unknown 10933 1726776683.47391: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_stat.py 10933 1726776683.47512: Sending initial data 10933 1726776683.47519: Sent initial data (152 bytes) 10933 1726776683.50372: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp75wdw4dd /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_stat.py <<< 10933 1726776683.51462: stderr chunk (state=3): >>><<< 10933 1726776683.51470: stdout chunk (state=3): >>><<< 10933 1726776683.51490: done transferring module to remote 10933 1726776683.51501: _low_level_execute_command(): starting 10933 1726776683.51507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/ /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_stat.py && sleep 0' 10933 1726776683.53932: stderr chunk (state=2): >>><<< 10933 1726776683.53942: stdout chunk (state=2): >>><<< 10933 1726776683.53956: _low_level_execute_command() done: rc=0, stdout=, stderr= 10933 1726776683.53962: _low_level_execute_command(): starting 10933 1726776683.53967: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_stat.py && sleep 0' 10933 1726776683.70661: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776683.314522, "mtime": 1726776675.7764935, "ctime": 1726776675.7764935, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", 
"attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10933 1726776683.71877: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10933 1726776683.71926: stderr chunk (state=3): >>><<< 10933 1726776683.71935: stdout chunk (state=3): >>><<< 10933 1726776683.71951: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776683.314522, "mtime": 1726776675.7764935, "ctime": 1726776675.7764935, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
10933 1726776683.71993: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10933 1726776683.72035: variable 'ansible_module_compression' from source: unknown 10933 1726776683.72067: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10933 1726776683.72084: variable 'ansible_facts' from source: unknown 10933 1726776683.72146: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_file.py 10933 1726776683.72237: Sending initial data 10933 1726776683.72244: Sent initial data (152 bytes) 10933 1726776683.75005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmperdtw2wn /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_file.py <<< 10933 1726776683.76157: stderr chunk (state=3): >>><<< 10933 1726776683.76164: stdout chunk (state=3): >>><<< 10933 1726776683.76181: done transferring module to remote 10933 1726776683.76190: _low_level_execute_command(): starting 10933 1726776683.76195: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/ 
/root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_file.py && sleep 0' 10933 1726776683.78567: stderr chunk (state=2): >>><<< 10933 1726776683.78575: stdout chunk (state=2): >>><<< 10933 1726776683.78588: _low_level_execute_command() done: rc=0, stdout=, stderr= 10933 1726776683.78592: _low_level_execute_command(): starting 10933 1726776683.78597: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/AnsiballZ_file.py && sleep 0' 10933 1726776683.94845: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpv2j9021v", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10933 1726776683.95950: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10933 1726776683.95994: stderr chunk (state=3): >>><<< 10933 1726776683.96000: stdout chunk (state=3): >>><<< 10933 1726776683.96015: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpv2j9021v", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10933 1726776683.96044: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpv2j9021v', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10933 1726776683.96056: _low_level_execute_command(): starting 10933 1726776683.96064: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776683.4404104-10933-169930017704332/ > /dev/null 2>&1 && sleep 0' 10933 1726776683.98427: stderr chunk (state=2): >>><<< 10933 1726776683.98438: stdout chunk (state=2): >>><<< 10933 1726776683.98453: _low_level_execute_command() done: rc=0, stdout=, stderr= 10933 1726776683.98463: handler run complete 10933 1726776683.98486: attempt loop complete, returning result 10933 1726776683.98491: _execute() done 10933 1726776683.98494: dumping result to json 10933 1726776683.98499: done dumping result, returning 10933 1726776683.98506: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-cec2-986e-00000000061c] 10933 1726776683.98513: sending task result for task 120fa90a-8a95-cec2-986e-00000000061c 10933 1726776683.98547: done sending task result for task 120fa90a-8a95-cec2-986e-00000000061c 10933 1726776683.98551: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8218 1726776683.98712: no more pending results, returning what we have 8218 1726776683.98715: results queue empty 8218 1726776683.98716: checking for any_errors_fatal 8218 1726776683.98722: done checking for any_errors_fatal 8218 1726776683.98723: checking for max_fail_percentage 8218 1726776683.98724: done checking for max_fail_percentage 8218 1726776683.98725: checking to see if all hosts have failed and the running result is not ok 8218 1726776683.98725: done checking to see if all hosts have failed 8218 1726776683.98726: getting the remaining hosts for this loop 8218 1726776683.98727: done getting the remaining hosts for this loop 8218 1726776683.98731: getting the next task for host managed_node2 8218 1726776683.98737: done getting next task for host managed_node2 8218 1726776683.98740: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8218 1726776683.98742: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776683.98752: getting variables 8218 1726776683.98753: in VariableManager get_vars() 8218 1726776683.98786: Calling all_inventory to load vars for managed_node2 8218 1726776683.98788: Calling groups_inventory to load vars for managed_node2 8218 1726776683.98790: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776683.98798: Calling all_plugins_play to load vars for managed_node2 8218 1726776683.98801: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776683.98803: Calling groups_plugins_play to load vars for managed_node2 8218 1726776683.98910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776683.99059: done with get_vars() 8218 1726776683.99067: done getting variables 8218 1726776683.99110: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 16:11:23 -0400 (0:00:00.590) 0:01:09.821 **** 8218 1726776683.99134: entering _queue_task() for managed_node2/copy 8218 1726776683.99298: worker is 1 (out of 1 available) 8218 1726776683.99313: exiting _queue_task() for managed_node2/copy 8218 1726776683.99324: done queuing things up, now waiting for results queue to drain 8218 1726776683.99325: waiting for pending results... 
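Each module invocation above first creates a throwaway remote directory such as `ansible-tmp-1726776684.0314088-10964-129938552031437`, i.e. `ansible-tmp-<epoch>-<worker pid>-<random>`. A hedged sketch of that naming convention (an illustration of the pattern seen in the log, not Ansible's actual implementation):

```python
import os
import random
import time

def make_remote_tmp_name(pid=None):
    """Mimic the ansible-tmp-<epoch>-<pid>-<random> pattern from the log.

    Illustrative only; the real name is produced inside Ansible's shell
    plugin when it builds the remote tmp path.
    """
    pid = pid if pid is not None else os.getpid()
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2**48))

# Worker pid 10964 matches the prefix of the log entries above.
name = make_remote_tmp_name(pid=10964)
print(name)
```

The random suffix keeps concurrent tasks from colliding even when the timestamp and pid repeat.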
10964 1726776683.99463: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10964 1726776683.99582: in run() - task 120fa90a-8a95-cec2-986e-00000000061d 10964 1726776683.99598: variable 'ansible_search_path' from source: unknown 10964 1726776683.99602: variable 'ansible_search_path' from source: unknown 10964 1726776683.99631: calling self._execute() 10964 1726776683.99698: variable 'ansible_host' from source: host vars for 'managed_node2' 10964 1726776683.99707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10964 1726776683.99717: variable 'omit' from source: magic vars 10964 1726776683.99796: variable 'omit' from source: magic vars 10964 1726776683.99830: variable 'omit' from source: magic vars 10964 1726776683.99851: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10964 1726776684.00070: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10964 1726776684.00134: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10964 1726776684.00166: variable 'omit' from source: magic vars 10964 1726776684.00198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10964 1726776684.00226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10964 1726776684.00245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10964 1726776684.00261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10964 1726776684.00273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10964 1726776684.00295: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10964 1726776684.00301: variable 'ansible_host' from 
source: host vars for 'managed_node2' 10964 1726776684.00305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10964 1726776684.00376: Set connection var ansible_connection to ssh 10964 1726776684.00384: Set connection var ansible_pipelining to False 10964 1726776684.00390: Set connection var ansible_timeout to 10 10964 1726776684.00397: Set connection var ansible_module_compression to ZIP_DEFLATED 10964 1726776684.00402: Set connection var ansible_shell_type to sh 10964 1726776684.00408: Set connection var ansible_shell_executable to /bin/sh 10964 1726776684.00424: variable 'ansible_shell_executable' from source: unknown 10964 1726776684.00428: variable 'ansible_connection' from source: unknown 10964 1726776684.00433: variable 'ansible_module_compression' from source: unknown 10964 1726776684.00436: variable 'ansible_shell_type' from source: unknown 10964 1726776684.00440: variable 'ansible_shell_executable' from source: unknown 10964 1726776684.00443: variable 'ansible_host' from source: host vars for 'managed_node2' 10964 1726776684.00447: variable 'ansible_pipelining' from source: unknown 10964 1726776684.00450: variable 'ansible_timeout' from source: unknown 10964 1726776684.00454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10964 1726776684.00548: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 10964 1726776684.00565: variable 'omit' from source: magic vars 10964 1726776684.00572: starting attempt loop 10964 1726776684.00575: running the handler 10964 1726776684.00586: _low_level_execute_command(): starting 10964 1726776684.00593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10964 1726776684.02984: stdout chunk (state=2): 
>>>/root <<< 10964 1726776684.03107: stderr chunk (state=3): >>><<< 10964 1726776684.03114: stdout chunk (state=3): >>><<< 10964 1726776684.03133: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10964 1726776684.03146: _low_level_execute_command(): starting 10964 1726776684.03151: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437 `" && echo ansible-tmp-1726776684.0314088-10964-129938552031437="` echo /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437 `" ) && sleep 0' 10964 1726776684.05756: stdout chunk (state=2): >>>ansible-tmp-1726776684.0314088-10964-129938552031437=/root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437 <<< 10964 1726776684.05883: stderr chunk (state=3): >>><<< 10964 1726776684.05889: stdout chunk (state=3): >>><<< 10964 1726776684.05903: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776684.0314088-10964-129938552031437=/root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437 , stderr= 10964 1726776684.05972: variable 'ansible_module_compression' from source: unknown 10964 1726776684.06015: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10964 1726776684.06044: variable 'ansible_facts' from source: unknown 10964 1726776684.06109: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_stat.py 10964 1726776684.06192: Sending initial data 10964 1726776684.06199: Sent initial data (152 bytes) 10964 1726776684.08727: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpq5_2hv3w /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_stat.py <<< 10964 1726776684.09985: stderr chunk (state=3): >>><<< 10964 1726776684.09993: stdout 
chunk (state=3): >>><<< 10964 1726776684.10013: done transferring module to remote 10964 1726776684.10023: _low_level_execute_command(): starting 10964 1726776684.10027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/ /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_stat.py && sleep 0' 10964 1726776684.12593: stderr chunk (state=2): >>><<< 10964 1726776684.12602: stdout chunk (state=2): >>><<< 10964 1726776684.12619: _low_level_execute_command() done: rc=0, stdout=, stderr= 10964 1726776684.12623: _low_level_execute_command(): starting 10964 1726776684.12630: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_stat.py && sleep 0' 10964 1726776684.28973: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776673.9694865, "mtime": 1726776675.7774935, "ctime": 1726776675.7774935, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10964 1726776684.30114: stderr chunk 
(state=3): >>>Shared connection to 10.31.12.75 closed. <<< 10964 1726776684.30160: stderr chunk (state=3): >>><<< 10964 1726776684.30168: stdout chunk (state=3): >>><<< 10964 1726776684.30182: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776673.9694865, "mtime": 1726776675.7774935, "ctime": 1726776675.7774935, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
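The executor treats the module's stdout as a single JSON document and pulls fields like `changed` and `mode` out of it. A trimmed-down sketch of that step, using an abbreviated (hypothetical) subset of the payload shown above:

```python
import json

# Abbreviated version of the AnsiballZ_stat.py stdout captured in the log;
# the real payload carries many more keys.
raw = ('{"changed": false, "stat": {"exists": true, '
       '"path": "/etc/tuned/profile_mode", "mode": "0600", "size": 7}}')

result = json.loads(raw)
print(result["changed"], result["stat"]["mode"])
```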
10964 1726776684.30226: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10964 1726776684.30268: variable 'ansible_module_compression' from source: unknown 10964 1726776684.30298: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10964 1726776684.30315: variable 'ansible_facts' from source: unknown 10964 1726776684.30376: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_file.py 10964 1726776684.30461: Sending initial data 10964 1726776684.30468: Sent initial data (152 bytes) 10964 1726776684.32982: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp4tlrv4_u /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_file.py <<< 10964 1726776684.34097: stderr chunk (state=3): >>><<< 10964 1726776684.34104: stdout chunk (state=3): >>><<< 10964 1726776684.34122: done transferring module to remote 10964 1726776684.34133: _low_level_execute_command(): starting 10964 1726776684.34138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/ 
/root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_file.py && sleep 0' 10964 1726776684.36517: stderr chunk (state=2): >>><<< 10964 1726776684.36527: stdout chunk (state=2): >>><<< 10964 1726776684.36544: _low_level_execute_command() done: rc=0, stdout=, stderr= 10964 1726776684.36549: _low_level_execute_command(): starting 10964 1726776684.36554: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/AnsiballZ_file.py && sleep 0' 10964 1726776684.52787: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp6acp86um", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10964 1726776684.54026: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10964 1726776684.54079: stderr chunk (state=3): >>><<< 10964 1726776684.54087: stdout chunk (state=3): >>><<< 10964 1726776684.54102: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp6acp86um", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
10964 1726776684.54135: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmp6acp86um', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10964 1726776684.54147: _low_level_execute_command(): starting 10964 1726776684.54159: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776684.0314088-10964-129938552031437/ > /dev/null 2>&1 && sleep 0' 10964 1726776684.56768: stderr chunk (state=2): >>><<< 10964 1726776684.56783: stdout chunk (state=2): >>><<< 10964 1726776684.56806: _low_level_execute_command() done: rc=0, stdout=, stderr= 10964 1726776684.56816: handler run complete 10964 1726776684.56847: attempt loop complete, returning result 10964 1726776684.56852: _execute() done 10964 1726776684.56855: dumping result to json 10964 1726776684.56861: done dumping result, returning 10964 1726776684.56870: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-cec2-986e-00000000061d] 10964 1726776684.56878: sending task result for task 120fa90a-8a95-cec2-986e-00000000061d 10964 1726776684.56920: done sending task result for task 120fa90a-8a95-cec2-986e-00000000061d 10964 1726776684.56924: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8218 1726776684.57289: no more pending results, returning what we have 8218 1726776684.57293: results queue empty 8218 1726776684.57294: checking for any_errors_fatal 8218 1726776684.57302: done checking for any_errors_fatal 8218 1726776684.57302: checking for max_fail_percentage 8218 1726776684.57304: done checking for max_fail_percentage 8218 1726776684.57305: checking to see if all hosts have failed and the running result is not ok 8218 1726776684.57306: done checking to see if all hosts have failed 8218 1726776684.57306: getting the remaining hosts for this loop 8218 1726776684.57309: done getting the remaining hosts for this loop 8218 1726776684.57312: getting the next task for host managed_node2 8218 1726776684.57318: done getting next task for host managed_node2 8218 1726776684.57321: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8218 1726776684.57323: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776684.57336: getting variables 8218 1726776684.57337: in VariableManager get_vars() 8218 1726776684.57370: Calling all_inventory to load vars for managed_node2 8218 1726776684.57372: Calling groups_inventory to load vars for managed_node2 8218 1726776684.57374: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776684.57382: Calling all_plugins_play to load vars for managed_node2 8218 1726776684.57385: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776684.57387: Calling groups_plugins_play to load vars for managed_node2 8218 1726776684.57554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776684.57702: done with get_vars() 8218 1726776684.57711: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 16:11:24 -0400 (0:00:00.586) 0:01:10.408 **** 8218 1726776684.57781: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776684.57956: worker is 1 (out of 1 available) 8218 1726776684.57970: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776684.57982: done queuing things up, now waiting for results queue to drain 8218 1726776684.57984: waiting for pending results... 
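[Editorial aside] The "Get current config" task queued here runs the collection's `kernel_settings_get_config` module against `/etc/tuned/kernel_settings/tuned.conf` and returns the profile as a dict of sections (`main`, `sysctl`, `sysfs`), as the `ok:` result further down shows. A hedged sketch of that parse using stdlib `configparser` — the real module may do this differently; the sample contents below are taken from the result in this log:

```python
import configparser

# Sample tuned.conf contents, reconstructed from the module result below.
TUNED_CONF = """\
[main]
summary = kernel settings

[sysctl]
fs.epoll.max_user_watches = 785592
vm.max_map_count = 65530

[sysfs]
/sys/class/net/lo/mtu = 65000
"""

def parse_tuned_conf(text):
    # tuned.conf is INI-style; keys can contain '/' and '%' so disable
    # interpolation, and preserve key case/paths exactly.
    cp = configparser.ConfigParser(interpolation=None, delimiters=("=",))
    cp.optionxform = str
    cp.read_string(text)
    return {section: dict(cp[section]) for section in cp.sections()}

data = parse_tuned_conf(TUNED_CONF)
print(data["sysctl"]["vm.max_map_count"])  # -> 65530
```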
10991 1726776684.58116: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10991 1726776684.58237: in run() - task 120fa90a-8a95-cec2-986e-00000000061e 10991 1726776684.58254: variable 'ansible_search_path' from source: unknown 10991 1726776684.58260: variable 'ansible_search_path' from source: unknown 10991 1726776684.58288: calling self._execute() 10991 1726776684.58352: variable 'ansible_host' from source: host vars for 'managed_node2' 10991 1726776684.58364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10991 1726776684.58371: variable 'omit' from source: magic vars 10991 1726776684.58449: variable 'omit' from source: magic vars 10991 1726776684.58485: variable 'omit' from source: magic vars 10991 1726776684.58505: variable '__kernel_settings_profile_filename' from source: role '' all vars 10991 1726776684.58722: variable '__kernel_settings_profile_filename' from source: role '' all vars 10991 1726776684.58785: variable '__kernel_settings_profile_dir' from source: role '' all vars 10991 1726776684.58848: variable '__kernel_settings_profile_parent' from source: set_fact 10991 1726776684.58856: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10991 1726776684.58946: variable 'omit' from source: magic vars 10991 1726776684.58981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10991 1726776684.59006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10991 1726776684.59025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10991 1726776684.59039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10991 1726776684.59048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10991 1726776684.59070: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10991 1726776684.59074: variable 'ansible_host' from source: host vars for 'managed_node2' 10991 1726776684.59078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10991 1726776684.59143: Set connection var ansible_connection to ssh 10991 1726776684.59149: Set connection var ansible_pipelining to False 10991 1726776684.59153: Set connection var ansible_timeout to 10 10991 1726776684.59160: Set connection var ansible_module_compression to ZIP_DEFLATED 10991 1726776684.59163: Set connection var ansible_shell_type to sh 10991 1726776684.59166: Set connection var ansible_shell_executable to /bin/sh 10991 1726776684.59181: variable 'ansible_shell_executable' from source: unknown 10991 1726776684.59184: variable 'ansible_connection' from source: unknown 10991 1726776684.59187: variable 'ansible_module_compression' from source: unknown 10991 1726776684.59188: variable 'ansible_shell_type' from source: unknown 10991 1726776684.59190: variable 'ansible_shell_executable' from source: unknown 10991 1726776684.59192: variable 'ansible_host' from source: host vars for 'managed_node2' 10991 1726776684.59194: variable 'ansible_pipelining' from source: unknown 10991 1726776684.59195: variable 'ansible_timeout' from source: unknown 10991 1726776684.59198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10991 1726776684.59318: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 10991 1726776684.59328: variable 'omit' from source: magic vars 10991 1726776684.59334: starting attempt loop 10991 1726776684.59336: running the handler 10991 1726776684.59346: _low_level_execute_command(): 
starting 10991 1726776684.59352: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10991 1726776684.61696: stdout chunk (state=2): >>>/root <<< 10991 1726776684.61814: stderr chunk (state=3): >>><<< 10991 1726776684.61820: stdout chunk (state=3): >>><<< 10991 1726776684.61841: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10991 1726776684.61853: _low_level_execute_command(): starting 10991 1726776684.61860: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690 `" && echo ansible-tmp-1726776684.6184835-10991-92984397934690="` echo /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690 `" ) && sleep 0' 10991 1726776684.64709: stdout chunk (state=2): >>>ansible-tmp-1726776684.6184835-10991-92984397934690=/root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690 <<< 10991 1726776684.64837: stderr chunk (state=3): >>><<< 10991 1726776684.64843: stdout chunk (state=3): >>><<< 10991 1726776684.64859: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776684.6184835-10991-92984397934690=/root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690 , stderr= 10991 1726776684.64893: variable 'ansible_module_compression' from source: unknown 10991 1726776684.64922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10991 1726776684.64950: variable 'ansible_facts' from source: unknown 10991 1726776684.65016: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/AnsiballZ_kernel_settings_get_config.py 10991 1726776684.65111: Sending initial data 10991 1726776684.65118: Sent initial data (173 bytes) 10991 1726776684.67602: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpjqil6k9j /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/AnsiballZ_kernel_settings_get_config.py <<< 10991 1726776684.68651: stderr chunk (state=3): >>><<< 10991 1726776684.68658: stdout chunk (state=3): >>><<< 10991 1726776684.68677: done transferring module to remote 10991 1726776684.68688: _low_level_execute_command(): starting 10991 1726776684.68693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/ /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10991 1726776684.71026: stderr chunk (state=2): >>><<< 10991 1726776684.71035: stdout chunk (state=2): >>><<< 10991 1726776684.71048: _low_level_execute_command() done: rc=0, stdout=, stderr= 10991 1726776684.71052: _low_level_execute_command(): starting 10991 1726776684.71057: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10991 1726776684.86955: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10991 1726776684.88099: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 10991 1726776684.88110: stdout chunk (state=3): >>><<< 10991 1726776684.88121: stderr chunk (state=3): >>><<< 10991 1726776684.88137: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 10991 1726776684.88172: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10991 1726776684.88184: _low_level_execute_command(): starting 10991 1726776684.88189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776684.6184835-10991-92984397934690/ > /dev/null 2>&1 && sleep 0' 10991 1726776684.90812: stderr chunk (state=2): >>><<< 10991 1726776684.90820: stdout chunk (state=2): >>><<< 10991 1726776684.90838: _low_level_execute_command() done: rc=0, stdout=, stderr= 10991 1726776684.90849: handler run complete 10991 1726776684.90874: 
attempt loop complete, returning result 10991 1726776684.90880: _execute() done 10991 1726776684.90888: dumping result to json 10991 1726776684.90893: done dumping result, returning 10991 1726776684.90901: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-cec2-986e-00000000061e] 10991 1726776684.90908: sending task result for task 120fa90a-8a95-cec2-986e-00000000061e 10991 1726776684.90939: done sending task result for task 120fa90a-8a95-cec2-986e-00000000061e 10991 1726776684.90944: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8218 1726776684.91122: no more pending results, returning what we have 8218 1726776684.91125: results queue empty 8218 1726776684.91126: checking for any_errors_fatal 8218 1726776684.91133: done checking for any_errors_fatal 8218 1726776684.91134: checking for max_fail_percentage 8218 1726776684.91135: done checking for max_fail_percentage 8218 1726776684.91136: checking to see if all hosts have failed and the running result is not ok 8218 1726776684.91137: done checking to see if all hosts have failed 8218 1726776684.91137: getting the remaining hosts for this loop 8218 1726776684.91138: done getting the remaining hosts for this loop 8218 1726776684.91143: getting the next task for host managed_node2 8218 1726776684.91148: done getting next task for host managed_node2 8218 1726776684.91151: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8218 1726776684.91153: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776684.91166: getting variables 8218 1726776684.91167: in VariableManager get_vars() 8218 1726776684.91197: Calling all_inventory to load vars for managed_node2 8218 1726776684.91199: Calling groups_inventory to load vars for managed_node2 8218 1726776684.91200: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776684.91207: Calling all_plugins_play to load vars for managed_node2 8218 1726776684.91208: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776684.91210: Calling groups_plugins_play to load vars for managed_node2 8218 1726776684.91356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776684.91477: done with get_vars() 8218 1726776684.91485: done getting variables 8218 1726776684.91530: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 16:11:24 -0400 (0:00:00.337) 0:01:10.746 **** 8218 1726776684.91554: entering _queue_task() for managed_node2/template 8218 1726776684.91711: worker 
is 1 (out of 1 available) 8218 1726776684.91725: exiting _queue_task() for managed_node2/template 8218 1726776684.91738: done queuing things up, now waiting for results queue to drain 8218 1726776684.91740: waiting for pending results... 11010 1726776684.91866: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11010 1726776684.91977: in run() - task 120fa90a-8a95-cec2-986e-00000000061f 11010 1726776684.91994: variable 'ansible_search_path' from source: unknown 11010 1726776684.91998: variable 'ansible_search_path' from source: unknown 11010 1726776684.92025: calling self._execute() 11010 1726776684.92095: variable 'ansible_host' from source: host vars for 'managed_node2' 11010 1726776684.92104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11010 1726776684.92113: variable 'omit' from source: magic vars 11010 1726776684.92190: variable 'omit' from source: magic vars 11010 1726776684.92225: variable 'omit' from source: magic vars 11010 1726776684.92455: variable '__kernel_settings_profile_src' from source: role '' all vars 11010 1726776684.92464: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11010 1726776684.92520: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11010 1726776684.92541: variable '__kernel_settings_profile_filename' from source: role '' all vars 11010 1726776684.92586: variable '__kernel_settings_profile_filename' from source: role '' all vars 11010 1726776684.92639: variable '__kernel_settings_profile_dir' from source: role '' all vars 11010 1726776684.92697: variable '__kernel_settings_profile_parent' from source: set_fact 11010 1726776684.92704: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11010 1726776684.92730: variable 'omit' from source: magic vars 11010 1726776684.92762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11010 1726776684.92787: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11010 1726776684.92804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11010 1726776684.92819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11010 1726776684.92832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11010 1726776684.92855: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11010 1726776684.92861: variable 'ansible_host' from source: host vars for 'managed_node2' 11010 1726776684.92864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11010 1726776684.92924: Set connection var ansible_connection to ssh 11010 1726776684.92933: Set connection var ansible_pipelining to False 11010 1726776684.92938: Set connection var ansible_timeout to 10 11010 1726776684.92943: Set connection var ansible_module_compression to ZIP_DEFLATED 11010 1726776684.92946: Set connection var ansible_shell_type to sh 11010 1726776684.92949: Set connection var ansible_shell_executable to /bin/sh 11010 1726776684.92962: variable 'ansible_shell_executable' from source: unknown 11010 1726776684.92965: variable 'ansible_connection' from source: unknown 11010 1726776684.92968: variable 'ansible_module_compression' from source: unknown 11010 1726776684.92969: variable 'ansible_shell_type' from source: unknown 11010 1726776684.92971: variable 'ansible_shell_executable' from source: unknown 11010 1726776684.92973: variable 'ansible_host' from source: host vars for 'managed_node2' 11010 1726776684.92975: variable 'ansible_pipelining' from source: unknown 11010 1726776684.92976: variable 'ansible_timeout' from source: unknown 11010 1726776684.92978: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11010 1726776684.93067: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11010 1726776684.93077: variable 'omit' from source: magic vars 11010 1726776684.93081: starting attempt loop 11010 1726776684.93083: running the handler 11010 1726776684.93092: _low_level_execute_command(): starting 11010 1726776684.93097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11010 1726776684.95432: stdout chunk (state=2): >>>/root <<< 11010 1726776684.95549: stderr chunk (state=3): >>><<< 11010 1726776684.95556: stdout chunk (state=3): >>><<< 11010 1726776684.95572: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11010 1726776684.95584: _low_level_execute_command(): starting 11010 1726776684.95589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247 `" && echo ansible-tmp-1726776684.9557934-11010-276866442247247="` echo /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247 `" ) && sleep 0' 11010 1726776684.98195: stdout chunk (state=2): >>>ansible-tmp-1726776684.9557934-11010-276866442247247=/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247 <<< 11010 1726776684.98321: stderr chunk (state=3): >>><<< 11010 1726776684.98327: stdout chunk (state=3): >>><<< 11010 1726776684.98342: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776684.9557934-11010-276866442247247=/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247 , stderr= 11010 1726776684.98356: evaluation_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11010 1726776684.98374: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11010 1726776684.98392: variable 'ansible_search_path' from source: unknown 11010 1726776684.98933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11010 1726776685.00580: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11010 1726776685.00626: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11010 1726776685.00656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11010 1726776685.00685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11010 1726776685.00704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11010 1726776685.00894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11010 1726776685.00915: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11010 1726776685.00932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11010 1726776685.00960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11010 1726776685.00969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11010 1726776685.01200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11010 1726776685.01218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11010 1726776685.01239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11010 1726776685.01267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11010 1726776685.01279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11010 
1726776685.01508: variable 'ansible_managed' from source: unknown 11010 1726776685.01515: variable '__sections' from source: task vars 11010 1726776685.01602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11010 1726776685.01619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11010 1726776685.01639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11010 1726776685.01668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11010 1726776685.01680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11010 1726776685.01747: variable 'kernel_settings_sysctl' from source: include params 11010 1726776685.01756: variable '__kernel_settings_state_empty' from source: role '' all vars 11010 1726776685.01764: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11010 1726776685.01801: variable '__sysctl_old' from source: task vars 11010 1726776685.01847: variable '__sysctl_old' from source: task vars 11010 1726776685.01986: variable 'kernel_settings_purge' from source: role '' defaults 11010 1726776685.01993: variable 'kernel_settings_sysctl' from source: include params 11010 1726776685.02000: variable '__kernel_settings_state_empty' from source: role '' all vars 11010 1726776685.02005: variable 
'__kernel_settings_previous_replaced' from source: role '' all vars 11010 1726776685.02010: variable '__kernel_settings_profile_contents' from source: set_fact 11010 1726776685.02146: variable 'kernel_settings_sysfs' from source: include params 11010 1726776685.02154: variable '__kernel_settings_state_empty' from source: role '' all vars 11010 1726776685.02162: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11010 1726776685.02181: variable '__sysfs_old' from source: task vars 11010 1726776685.02224: variable '__sysfs_old' from source: task vars 11010 1726776685.02362: variable 'kernel_settings_purge' from source: role '' defaults 11010 1726776685.02368: variable 'kernel_settings_sysfs' from source: include params 11010 1726776685.02375: variable '__kernel_settings_state_empty' from source: role '' all vars 11010 1726776685.02380: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11010 1726776685.02384: variable '__kernel_settings_profile_contents' from source: set_fact 11010 1726776685.02423: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 11010 1726776685.02433: variable '__systemd_old' from source: task vars 11010 1726776685.02476: variable '__systemd_old' from source: task vars 11010 1726776685.02604: variable 'kernel_settings_purge' from source: role '' defaults 11010 1726776685.02611: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 11010 1726776685.02617: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.02623: variable '__kernel_settings_profile_contents' from source: set_fact 11010 1726776685.02637: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 11010 1726776685.02642: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 11010 1726776685.02646: variable '__trans_huge_old' from source: task vars 11010 1726776685.02688: variable 
'__trans_huge_old' from source: task vars 11010 1726776685.02818: variable 'kernel_settings_purge' from source: role '' defaults 11010 1726776685.02825: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 11010 1726776685.02831: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.02837: variable '__kernel_settings_profile_contents' from source: set_fact 11010 1726776685.02848: variable '__trans_defrag_old' from source: task vars 11010 1726776685.02890: variable '__trans_defrag_old' from source: task vars 11010 1726776685.03017: variable 'kernel_settings_purge' from source: role '' defaults 11010 1726776685.03024: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 11010 1726776685.03030: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03036: variable '__kernel_settings_profile_contents' from source: set_fact 11010 1726776685.03050: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03064: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03078: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03086: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03092: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03105: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03112: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03121: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03131: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03138: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03144: variable '__kernel_settings_state_absent' from source: 
role '' all vars 11010 1726776685.03149: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03155: variable '__kernel_settings_state_absent' from source: role '' all vars 11010 1726776685.03589: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11010 1726776685.03633: variable 'ansible_module_compression' from source: unknown 11010 1726776685.03671: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11010 1726776685.03695: variable 'ansible_facts' from source: unknown 11010 1726776685.03762: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_stat.py 11010 1726776685.03851: Sending initial data 11010 1726776685.03858: Sent initial data (152 bytes) 11010 1726776685.06456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpx_0nj6fj /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_stat.py <<< 11010 1726776685.07663: stderr chunk (state=3): >>><<< 11010 1726776685.07671: stdout chunk (state=3): >>><<< 11010 1726776685.07693: done transferring module to remote 11010 1726776685.07704: _low_level_execute_command(): starting 11010 1726776685.07711: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/ /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_stat.py && sleep 0' 11010 1726776685.10382: stderr chunk (state=2): >>><<< 11010 1726776685.10394: stdout chunk (state=2): >>><<< 11010 1726776685.10409: _low_level_execute_command() done: rc=0, stdout=, stderr= 11010 
1726776685.10413: _low_level_execute_command(): starting 11010 1726776685.10421: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_stat.py && sleep 0' 11010 1726776685.27238: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 351, "inode": 90177794, "dev": 51713, "nlink": 1, "atime": 1726776675.7644935, "mtime": 1726776674.9944904, "ctime": 1726776675.2494915, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "mimetype": "text/plain", "charset": "us-ascii", "version": "2830193438", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11010 1726776685.28378: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11010 1726776685.28428: stderr chunk (state=3): >>><<< 11010 1726776685.28436: stdout chunk (state=3): >>><<< 11010 1726776685.28453: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 351, "inode": 90177794, "dev": 51713, "nlink": 1, "atime": 1726776675.7644935, "mtime": 1726776674.9944904, "ctime": 1726776675.2494915, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "mimetype": "text/plain", "charset": "us-ascii", "version": "2830193438", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
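The JSON above is the stat module's report for the existing /etc/tuned/kernel_settings/tuned.conf (mode, size, sha1 checksum, and so on). A minimal sketch of how a few of those fields can be reproduced locally — the temporary file and its contents here are hypothetical stand-ins, not the real tuned.conf:

```python
import hashlib
import os
import stat as stat_mod
import tempfile

def stat_report(path):
    """Collect a few of the fields the stat module reports: mode, size, sha1 checksum."""
    st = os.stat(path)
    with open(path, "rb") as f:
        checksum = hashlib.sha1(f.read()).hexdigest()
    return {
        "exists": True,
        "path": path,
        "mode": "%04o" % stat_mod.S_IMODE(st.st_mode),
        "size": st.st_size,
        "isreg": stat_mod.S_ISREG(st.st_mode),
        "checksum": checksum,
    }

# Demonstrate on a temporary file (hypothetical stand-in for tuned.conf).
with tempfile.NamedTemporaryFile("w", suffix=".conf", delete=False) as f:
    f.write("[main]\nsummary = kernel settings\n")
    tmp = f.name
report = stat_report(tmp)
print(report["size"], report["mode"], report["isreg"])
os.unlink(tmp)
```

The real module reports many more fields (ownership, SELinux context, mime type), but the sha1 checksum is the one the subsequent copy step compares against.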
11010 1726776685.28493: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11010 1726776685.28584: Sending initial data 11010 1726776685.28592: Sent initial data (160 bytes) 11010 1726776685.31168: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpo07ten6g/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source <<< 11010 1726776685.31526: stderr chunk (state=3): >>><<< 11010 1726776685.31535: stdout chunk (state=3): >>><<< 11010 1726776685.31550: _low_level_execute_command(): starting 11010 1726776685.31556: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/ /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source && sleep 0' 11010 1726776685.33901: stderr chunk (state=2): >>><<< 11010 1726776685.33910: stdout chunk (state=2): >>><<< 11010 1726776685.33925: _low_level_execute_command() done: rc=0, stdout=, stderr= 11010 1726776685.33947: variable 'ansible_module_compression' from source: unknown 11010 1726776685.33985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11010 1726776685.34004: 
variable 'ansible_facts' from source: unknown 11010 1726776685.34062: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_copy.py 11010 1726776685.34153: Sending initial data 11010 1726776685.34163: Sent initial data (152 bytes) 11010 1726776685.36698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpgzzekit9 /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_copy.py <<< 11010 1726776685.38012: stderr chunk (state=3): >>><<< 11010 1726776685.38021: stdout chunk (state=3): >>><<< 11010 1726776685.38044: done transferring module to remote 11010 1726776685.38054: _low_level_execute_command(): starting 11010 1726776685.38059: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/ /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_copy.py && sleep 0' 11010 1726776685.40808: stderr chunk (state=2): >>><<< 11010 1726776685.40819: stdout chunk (state=2): >>><<< 11010 1726776685.40837: _low_level_execute_command() done: rc=0, stdout=, stderr= 11010 1726776685.40843: _low_level_execute_command(): starting 11010 1726776685.40849: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/AnsiballZ_copy.py && sleep 0' 11010 1726776685.58197: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source", "md5sum": "1fd7f2202613b516022cf613601e26bd", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": 
"/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11010 1726776685.59410: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11010 1726776685.59459: stderr chunk (state=3): >>><<< 11010 1726776685.59466: stdout chunk (state=3): >>><<< 11010 1726776685.59482: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source", "md5sum": "1fd7f2202613b516022cf613601e26bd", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11010 1726776685.59508: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '3107bf46f5c007ef178305bb243dd11664f9bf35', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11010 1726776685.59538: _low_level_execute_command(): starting 11010 1726776685.59547: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/ > /dev/null 2>&1 && sleep 0' 11010 1726776685.61981: stderr chunk (state=2): >>><<< 11010 1726776685.61989: stdout chunk (state=2): >>><<< 11010 1726776685.62004: _low_level_execute_command() done: rc=0, stdout=, stderr= 11010 1726776685.62015: handler run complete 11010 1726776685.62036: attempt loop complete, returning result 11010 1726776685.62040: _execute() done 11010 1726776685.62043: dumping result to json 11010 1726776685.62048: done dumping result, returning 11010 1726776685.62056: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-cec2-986e-00000000061f] 11010 1726776685.62063: sending task result for task 120fa90a-8a95-cec2-986e-00000000061f 11010 1726776685.62107: done sending task result for task 
120fa90a-8a95-cec2-986e-00000000061f 11010 1726776685.62112: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "1fd7f2202613b516022cf613601e26bd", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "src": "/root/.ansible/tmp/ansible-tmp-1726776684.9557934-11010-276866442247247/source", "state": "file", "uid": 0 } 8218 1726776685.62317: no more pending results, returning what we have 8218 1726776685.62320: results queue empty 8218 1726776685.62321: checking for any_errors_fatal 8218 1726776685.62327: done checking for any_errors_fatal 8218 1726776685.62328: checking for max_fail_percentage 8218 1726776685.62330: done checking for max_fail_percentage 8218 1726776685.62331: checking to see if all hosts have failed and the running result is not ok 8218 1726776685.62332: done checking to see if all hosts have failed 8218 1726776685.62332: getting the remaining hosts for this loop 8218 1726776685.62333: done getting the remaining hosts for this loop 8218 1726776685.62337: getting the next task for host managed_node2 8218 1726776685.62342: done getting next task for host managed_node2 8218 1726776685.62344: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8218 1726776685.62347: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
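The copy task reports `"changed": true` because the rendered template's checksum (3107bf46…) differs from the checksum the stat step found on the existing file (221aa34f…). A rough sketch of that idempotency check, assuming the checksums and placeholder content below (the real comparison happens inside the copy action plugin):

```python
import hashlib

def sha1_of(data: bytes) -> str:
    """sha1 hex digest, the checksum algorithm the log shows in use."""
    return hashlib.sha1(data).hexdigest()

def copy_needed(dest_checksum: str, new_content: bytes) -> bool:
    """Copy only when the rendered content's checksum differs from the destination's."""
    return sha1_of(new_content) != dest_checksum

# Hypothetical stand-ins: the checksum stat reported for the old file,
# and placeholder bytes for the newly rendered kernel_settings.j2.
old = "221aa34fef95c2fe05408be9921820449785a5b2"
new_content = b"# rendered kernel_settings.j2 (placeholder)\n"
changed = copy_needed(old, new_content)
print(changed)
```

When the checksums match, the module skips the write and reports `changed: false`, which is what keeps repeated runs idempotent.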
(None), did rescue? False, did start at task? False 8218 1726776685.62356: getting variables 8218 1726776685.62359: in VariableManager get_vars() 8218 1726776685.62392: Calling all_inventory to load vars for managed_node2 8218 1726776685.62394: Calling groups_inventory to load vars for managed_node2 8218 1726776685.62395: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776685.62402: Calling all_plugins_play to load vars for managed_node2 8218 1726776685.62403: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776685.62405: Calling groups_plugins_play to load vars for managed_node2 8218 1726776685.62510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776685.62633: done with get_vars() 8218 1726776685.62642: done getting variables 8218 1726776685.62686: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:11:25 -0400 (0:00:00.711) 0:01:11.457 **** 8218 1726776685.62709: entering _queue_task() for managed_node2/service 8218 1726776685.62878: worker is 1 (out of 1 available) 8218 1726776685.62892: exiting _queue_task() for managed_node2/service 8218 1726776685.62904: done queuing things up, now waiting for results queue to drain 8218 1726776685.62906: waiting for pending results... 
11040 1726776685.63029: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11040 1726776685.63147: in run() - task 120fa90a-8a95-cec2-986e-000000000620 11040 1726776685.63164: variable 'ansible_search_path' from source: unknown 11040 1726776685.63169: variable 'ansible_search_path' from source: unknown 11040 1726776685.63202: variable '__kernel_settings_services' from source: include_vars 11040 1726776685.63435: variable '__kernel_settings_services' from source: include_vars 11040 1726776685.63762: variable 'omit' from source: magic vars 11040 1726776685.63832: variable 'ansible_host' from source: host vars for 'managed_node2' 11040 1726776685.63843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11040 1726776685.63852: variable 'omit' from source: magic vars 11040 1726776685.64023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11040 1726776685.64185: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11040 1726776685.64218: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11040 1726776685.64245: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11040 1726776685.64271: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11040 1726776685.64340: variable '__kernel_settings_register_profile' from source: set_fact 11040 1726776685.64351: variable '__kernel_settings_register_mode' from source: set_fact 11040 1726776685.64367: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 11040 1726776685.64371: when evaluation is False, skipping this task 11040 1726776685.64391: variable 'item' from source: unknown 11040 
1726776685.64438: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 11040 1726776685.64466: dumping result to json 11040 1726776685.64472: done dumping result, returning 11040 1726776685.64478: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-cec2-986e-000000000620] 11040 1726776685.64484: sending task result for task 120fa90a-8a95-cec2-986e-000000000620 11040 1726776685.64506: done sending task result for task 120fa90a-8a95-cec2-986e-000000000620 11040 1726776685.64509: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8218 1726776685.64660: no more pending results, returning what we have 8218 1726776685.64663: results queue empty 8218 1726776685.64664: checking for any_errors_fatal 8218 1726776685.64672: done checking for any_errors_fatal 8218 1726776685.64673: checking for max_fail_percentage 8218 1726776685.64674: done checking for max_fail_percentage 8218 1726776685.64674: checking to see if all hosts have failed and the running result is not ok 8218 1726776685.64675: done checking to see if all hosts have failed 8218 1726776685.64676: getting the remaining hosts for this loop 8218 1726776685.64677: done getting the remaining hosts for this loop 8218 1726776685.64679: getting the next task for host managed_node2 8218 1726776685.64684: done getting next task for host managed_node2 8218 1726776685.64687: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8218 1726776685.64689: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
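The restart task is skipped because the conditional `__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed` evaluated to False. Jinja2's `is changed` test essentially inspects the `changed` key of a registered task result; a minimal sketch of that evaluation, with hypothetical registered results matching what the log shows:

```python
def is_changed(result: dict) -> bool:
    """Rough equivalent of the Jinja2 'is changed' test on a registered result."""
    return bool(result.get("changed", False))

# Hypothetical registered results: neither earlier task reported a change.
register_profile = {"changed": False}
register_mode = {"changed": False}

run_restart = is_changed(register_profile) or is_changed(register_mode)
print(run_restart)  # the restart task is skipped when this is False
```

Ansible records the skip per loop item (here `item=tuned`) with `skip_reason: Conditional result was False`, as the output above shows.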
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776685.64701: getting variables 8218 1726776685.64702: in VariableManager get_vars() 8218 1726776685.64724: Calling all_inventory to load vars for managed_node2 8218 1726776685.64726: Calling groups_inventory to load vars for managed_node2 8218 1726776685.64727: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776685.64735: Calling all_plugins_play to load vars for managed_node2 8218 1726776685.64737: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776685.64738: Calling groups_plugins_play to load vars for managed_node2 8218 1726776685.64839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776685.64956: done with get_vars() 8218 1726776685.64966: done getting variables 8218 1726776685.65005: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:11:25 -0400 (0:00:00.023) 0:01:11.480 **** 8218 1726776685.65027: entering _queue_task() for managed_node2/command 8218 1726776685.65181: worker is 1 (out of 1 available) 8218 
1726776685.65194: exiting _queue_task() for managed_node2/command 8218 1726776685.65206: done queuing things up, now waiting for results queue to drain 8218 1726776685.65208: waiting for pending results... 11041 1726776685.65325: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11041 1726776685.65439: in run() - task 120fa90a-8a95-cec2-986e-000000000621 11041 1726776685.65454: variable 'ansible_search_path' from source: unknown 11041 1726776685.65460: variable 'ansible_search_path' from source: unknown 11041 1726776685.65483: calling self._execute() 11041 1726776685.65543: variable 'ansible_host' from source: host vars for 'managed_node2' 11041 1726776685.65549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11041 1726776685.65555: variable 'omit' from source: magic vars 11041 1726776685.65869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11041 1726776685.66308: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11041 1726776685.66342: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11041 1726776685.66367: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11041 1726776685.66391: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11041 1726776685.66469: variable '__kernel_settings_register_profile' from source: set_fact 11041 1726776685.66490: Evaluated conditional (not __kernel_settings_register_profile is changed): True 11041 1726776685.66576: variable '__kernel_settings_register_mode' from source: set_fact 11041 1726776685.66588: Evaluated conditional (not __kernel_settings_register_mode is changed): True 11041 1726776685.66665: variable '__kernel_settings_register_apply' from source: set_fact 11041 
1726776685.66676: Evaluated conditional (__kernel_settings_register_apply is changed): True 11041 1726776685.66682: variable 'omit' from source: magic vars 11041 1726776685.66710: variable 'omit' from source: magic vars 11041 1726776685.66791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11041 1726776685.68168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11041 1726776685.68219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11041 1726776685.68262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11041 1726776685.68287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11041 1726776685.68307: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11041 1726776685.68358: variable '__kernel_settings_active_profile' from source: set_fact 11041 1726776685.68385: variable 'omit' from source: magic vars 11041 1726776685.68407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11041 1726776685.68428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11041 1726776685.68445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11041 1726776685.68458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11041 1726776685.68468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11041 1726776685.68490: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11041 1726776685.68495: variable 'ansible_host' 
from source: host vars for 'managed_node2' 11041 1726776685.68498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11041 1726776685.68560: Set connection var ansible_connection to ssh 11041 1726776685.68566: Set connection var ansible_pipelining to False 11041 1726776685.68570: Set connection var ansible_timeout to 10 11041 1726776685.68575: Set connection var ansible_module_compression to ZIP_DEFLATED 11041 1726776685.68579: Set connection var ansible_shell_type to sh 11041 1726776685.68582: Set connection var ansible_shell_executable to /bin/sh 11041 1726776685.68596: variable 'ansible_shell_executable' from source: unknown 11041 1726776685.68598: variable 'ansible_connection' from source: unknown 11041 1726776685.68600: variable 'ansible_module_compression' from source: unknown 11041 1726776685.68602: variable 'ansible_shell_type' from source: unknown 11041 1726776685.68605: variable 'ansible_shell_executable' from source: unknown 11041 1726776685.68606: variable 'ansible_host' from source: host vars for 'managed_node2' 11041 1726776685.68609: variable 'ansible_pipelining' from source: unknown 11041 1726776685.68610: variable 'ansible_timeout' from source: unknown 11041 1726776685.68612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11041 1726776685.68675: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11041 1726776685.68685: variable 'omit' from source: magic vars 11041 1726776685.68689: starting attempt loop 11041 1726776685.68692: running the handler 11041 1726776685.68701: _low_level_execute_command(): starting 11041 1726776685.68705: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11041 1726776685.71053: stdout chunk 
(state=2): >>>/root <<< 11041 1726776685.71174: stderr chunk (state=3): >>><<< 11041 1726776685.71180: stdout chunk (state=3): >>><<< 11041 1726776685.71197: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11041 1726776685.71207: _low_level_execute_command(): starting 11041 1726776685.71213: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772 `" && echo ansible-tmp-1726776685.7120397-11041-201676634446772="` echo /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772 `" ) && sleep 0' 11041 1726776685.73776: stdout chunk (state=2): >>>ansible-tmp-1726776685.7120397-11041-201676634446772=/root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772 <<< 11041 1726776685.73903: stderr chunk (state=3): >>><<< 11041 1726776685.73911: stdout chunk (state=3): >>><<< 11041 1726776685.73924: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776685.7120397-11041-201676634446772=/root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772 , stderr= 11041 1726776685.73947: variable 'ansible_module_compression' from source: unknown 11041 1726776685.73981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11041 1726776685.74009: variable 'ansible_facts' from source: unknown 11041 1726776685.74082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/AnsiballZ_command.py 11041 1726776685.74177: Sending initial data 11041 1726776685.74184: Sent initial data (155 bytes) 11041 1726776685.76664: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpsn71nbjj /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/AnsiballZ_command.py <<< 11041 1726776685.77717: stderr chunk (state=3): >>><<< 11041 
1726776685.77723: stdout chunk (state=3): >>><<<
11041 1726776685.77741: done transferring module to remote
11041 1726776685.77751: _low_level_execute_command(): starting
11041 1726776685.77760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/ /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/AnsiballZ_command.py && sleep 0'
11041 1726776685.80117: stderr chunk (state=2): >>><<<
11041 1726776685.80128: stdout chunk (state=2): >>><<<
11041 1726776685.80145: _low_level_execute_command() done: rc=0, stdout=, stderr=
11041 1726776685.80149: _low_level_execute_command(): starting
11041 1726776685.80155: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/AnsiballZ_command.py && sleep 0'
11041 1726776687.09354: stdout chunk (state=2): >>>
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:25.952389", "end": "2024-09-19 16:11:27.091703", "delta": "0:00:01.139314", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
<<<
11041 1726776687.10567: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed.
<<<
11041 1726776687.10615: stderr chunk (state=3): >>><<<
11041 1726776687.10622: stdout chunk (state=3): >>><<<
11041 1726776687.10642: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:25.952389", "end": "2024-09-19 16:11:27.091703", "delta": "0:00:01.139314", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=Shared connection to 10.31.12.75 closed.
11041 1726776687.10670: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/', '_ansible_remote_tmp': '~/.ansible/tmp'})
11041 1726776687.10680: _low_level_execute_command(): starting
11041 1726776687.10686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776685.7120397-11041-201676634446772/ > /dev/null 2>&1 && sleep 0'
11041 1726776687.13197: stderr chunk (state=2): >>><<<
11041 1726776687.13204: stdout chunk (state=2): >>><<<
11041 1726776687.13217: _low_level_execute_command() done: rc=0, stdout=, stderr=
11041 1726776687.13224: handler run complete
11041 1726776687.13242: Evaluated conditional (True): True
11041 1726776687.13251: attempt loop complete, returning result
11041 1726776687.13254: _execute() done
11041 1726776687.13260: dumping result to json
11041 1726776687.13266: done dumping result, returning
11041 1726776687.13273: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-cec2-986e-000000000621]
11041 1726776687.13278: sending task result for task 120fa90a-8a95-cec2-986e-000000000621
11041 1726776687.13306: done sending task result for task 120fa90a-8a95-cec2-986e-000000000621
11041 1726776687.13310: WORKER PROCESS EXITING
changed: [managed_node2] => {
    "changed": true,
    "cmd": [
        "tuned-adm",
        "profile",
        "virtual-guest kernel_settings"
    ],
    "delta": "0:00:01.139314",
    "end": "2024-09-19 16:11:27.091703",
    "rc": 0,
    "start": "2024-09-19 16:11:25.952389"
}
8218 1726776687.13482: no more pending results, returning what we have
8218 1726776687.13485: results queue empty
8218 1726776687.13486: checking for any_errors_fatal
8218 1726776687.13494: done checking for any_errors_fatal
8218 1726776687.13495: checking for max_fail_percentage
8218 1726776687.13496: done checking for max_fail_percentage
8218 1726776687.13497: checking to see if all hosts have failed and the running result is not ok
8218 1726776687.13497: done checking to see if all hosts have failed
8218 1726776687.13498: getting the remaining hosts for this loop
8218 1726776687.13499: done getting the remaining hosts for this loop
8218 1726776687.13502: getting the next task for host managed_node2
8218 1726776687.13508: done getting next task for host managed_node2
8218 1726776687.13511: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings
8218 1726776687.13513: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776687.13523: getting variables
8218 1726776687.13524: in VariableManager get_vars()
8218 1726776687.13558: Calling all_inventory to load vars for managed_node2
8218 1726776687.13561: Calling groups_inventory to load vars for managed_node2
8218 1726776687.13563: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776687.13572: Calling all_plugins_play to load vars for managed_node2
8218 1726776687.13574: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776687.13576: Calling groups_plugins_play to load vars for managed_node2
8218 1726776687.13687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776687.14022: done with get_vars()
8218 1726776687.14037: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Verify settings] *************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166
Thursday 19 September 2024 16:11:27 -0400 (0:00:01.490) 0:01:12.971 ****
8218 1726776687.14104: entering _queue_task() for managed_node2/include_tasks
8218 1726776687.14259: worker is 1 (out of 1 available)
8218 1726776687.14280: exiting _queue_task() for managed_node2/include_tasks
8218 1726776687.14293: done queuing things up, now waiting for results queue to drain
8218 1726776687.14294: waiting for pending results...
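The JSON blob streamed back in the stdout chunk above is what the command action plugin turns into the `changed:` result for "Tuned apply settings". A minimal sketch of reading that payload (the literal is copied from the log, with the `invocation` details trimmed; this is not Ansible's own parser):

```python
import json

# Module result as streamed back by AnsiballZ_command.py in the log above
# (invocation/module_args omitted for brevity).
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], '
    '"start": "2024-09-19 16:11:25.952389", "end": "2024-09-19 16:11:27.091703", '
    '"delta": "0:00:01.139314", "msg": ""}'
)

result = json.loads(raw)
print(result["rc"])       # 0
print(result["changed"])  # True
print(result["cmd"][0])   # tuned-adm
```

The fields printed here (`rc`, `changed`, `cmd`) are exactly what the final `changed: [managed_node2]` summary echoes back.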
11080 1726776687.14422: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings
11080 1726776687.14542: in run() - task 120fa90a-8a95-cec2-986e-000000000622
11080 1726776687.14560: variable 'ansible_search_path' from source: unknown
11080 1726776687.14564: variable 'ansible_search_path' from source: unknown
11080 1726776687.14592: calling self._execute()
11080 1726776687.14659: variable 'ansible_host' from source: host vars for 'managed_node2'
11080 1726776687.14668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11080 1726776687.14677: variable 'omit' from source: magic vars
11080 1726776687.15002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11080 1726776687.15185: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11080 1726776687.15220: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11080 1726776687.15248: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11080 1726776687.15278: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11080 1726776687.15375: variable '__kernel_settings_register_apply' from source: set_fact
11080 1726776687.15400: Evaluated conditional (__kernel_settings_register_apply is changed): True
11080 1726776687.15406: _execute() done
11080 1726776687.15408: dumping result to json
11080 1726776687.15410: done dumping result, returning
11080 1726776687.15415: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-cec2-986e-000000000622]
11080 1726776687.15420: sending task result for task 120fa90a-8a95-cec2-986e-000000000622
11080 1726776687.15442: done sending task result for task 120fa90a-8a95-cec2-986e-000000000622
11080 1726776687.15444: WORKER PROCESS EXITING
8218 1726776687.15662: no more pending results, returning what we have
8218 1726776687.15665: in VariableManager get_vars()
8218 1726776687.15692: Calling all_inventory to load vars for managed_node2
8218 1726776687.15694: Calling groups_inventory to load vars for managed_node2
8218 1726776687.15695: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776687.15701: Calling all_plugins_play to load vars for managed_node2
8218 1726776687.15703: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776687.15704: Calling groups_plugins_play to load vars for managed_node2
8218 1726776687.15866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776687.16103: done with get_vars()
8218 1726776687.16111: variable 'ansible_search_path' from source: unknown
8218 1726776687.16112: variable 'ansible_search_path' from source: unknown
8218 1726776687.16152: we have included files to process
8218 1726776687.16153: generating all_blocks data
8218 1726776687.16155: done generating all_blocks data
8218 1726776687.16161: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
8218 1726776687.16162: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
8218 1726776687.16165: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml
included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2
8218 1726776687.16656: done processing included file
8218 1726776687.16659: iterating over new_blocks loaded from include file
8218 1726776687.16661: in VariableManager get_vars()
8218 1726776687.16687: done with get_vars()
8218 1726776687.16689: filtering new block on tags
8218 1726776687.16754: done filtering new block on tags
8218 1726776687.16772: done iterating over new_blocks loaded from include file
8218 1726776687.16773: extending task lists for all hosts with included blocks
8218 1726776687.17621: done extending task lists
8218 1726776687.17623: done processing included files
8218 1726776687.17624: results queue empty
8218 1726776687.17624: checking for any_errors_fatal
8218 1726776687.17631: done checking for any_errors_fatal
8218 1726776687.17632: checking for max_fail_percentage
8218 1726776687.17633: done checking for max_fail_percentage
8218 1726776687.17634: checking to see if all hosts have failed and the running result is not ok
8218 1726776687.17635: done checking to see if all hosts have failed
8218 1726776687.17635: getting the remaining hosts for this loop
8218 1726776687.17636: done getting the remaining hosts for this loop
8218 1726776687.17639: getting the next task for host managed_node2
8218 1726776687.17643: done getting next task for host managed_node2
8218 1726776687.17646: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly
8218 1726776687.17649: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776687.17659: getting variables
8218 1726776687.17660: in VariableManager get_vars()
8218 1726776687.17674: Calling all_inventory to load vars for managed_node2
8218 1726776687.17676: Calling groups_inventory to load vars for managed_node2
8218 1726776687.17678: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776687.17683: Calling all_plugins_play to load vars for managed_node2
8218 1726776687.17686: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776687.17688: Calling groups_plugins_play to load vars for managed_node2
8218 1726776687.17860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776687.18106: done with get_vars()
8218 1726776687.18116: done getting variables
8218 1726776687.18156: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
Thursday 19 September 2024 16:11:27 -0400 (0:00:00.040) 0:01:13.012 ****
8218 1726776687.18189: entering _queue_task() for managed_node2/command
8218 1726776687.18448: worker is 1 (out of 1 available)
8218 1726776687.18458: exiting _queue_task() for managed_node2/command
8218 1726776687.18469: done queuing things up, now waiting for results queue to drain
8218 1726776687.18471: waiting for pending results...
11081 1726776687.18610: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly
11081 1726776687.18748: in run() - task 120fa90a-8a95-cec2-986e-0000000007f9
11081 1726776687.18766: variable 'ansible_search_path' from source: unknown
11081 1726776687.18770: variable 'ansible_search_path' from source: unknown
11081 1726776687.18798: calling self._execute()
11081 1726776687.18875: variable 'ansible_host' from source: host vars for 'managed_node2'
11081 1726776687.18884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11081 1726776687.18893: variable 'omit' from source: magic vars
11081 1726776687.18974: variable 'omit' from source: magic vars
11081 1726776687.19015: variable 'omit' from source: magic vars
11081 1726776687.19043: variable 'omit' from source: magic vars
11081 1726776687.19087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11081 1726776687.19111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11081 1726776687.19132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11081 1726776687.19145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11081 1726776687.19154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11081 1726776687.19178: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11081 1726776687.19183: variable 'ansible_host' from source: host vars for 'managed_node2'
11081 1726776687.19185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11081 1726776687.19255: Set connection var ansible_connection to ssh
11081 1726776687.19263: Set connection var ansible_pipelining to False
11081 1726776687.19267: Set connection var ansible_timeout to 10
11081 1726776687.19272: Set connection var ansible_module_compression to ZIP_DEFLATED
11081 1726776687.19275: Set connection var ansible_shell_type to sh
11081 1726776687.19278: Set connection var ansible_shell_executable to /bin/sh
11081 1726776687.19295: variable 'ansible_shell_executable' from source: unknown
11081 1726776687.19302: variable 'ansible_connection' from source: unknown
11081 1726776687.19307: variable 'ansible_module_compression' from source: unknown
11081 1726776687.19310: variable 'ansible_shell_type' from source: unknown
11081 1726776687.19315: variable 'ansible_shell_executable' from source: unknown
11081 1726776687.19319: variable 'ansible_host' from source: host vars for 'managed_node2'
11081 1726776687.19322: variable 'ansible_pipelining' from source: unknown
11081 1726776687.19327: variable 'ansible_timeout' from source: unknown
11081 1726776687.19334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11081 1726776687.19426: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
11081 1726776687.19442: variable 'omit' from source: magic vars
11081 1726776687.19448: starting attempt loop
11081 1726776687.19452: running the handler
11081 1726776687.19469: _low_level_execute_command(): starting
11081 1726776687.19478: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11081 1726776687.21941: stdout chunk (state=2): >>>/root
<<<
11081 1726776687.22036: stderr chunk (state=3): >>><<<
11081 1726776687.22043: stdout chunk (state=3): >>><<<
11081 1726776687.22063: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=
11081 1726776687.22081: _low_level_execute_command(): starting
11081 1726776687.22087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771 `" && echo ansible-tmp-1726776687.2207417-11081-204455755959771="` echo /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771 `" ) && sleep 0'
11081 1726776687.24864: stdout chunk (state=2): >>>ansible-tmp-1726776687.2207417-11081-204455755959771=/root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771
<<<
11081 1726776687.24925: stderr chunk (state=3): >>><<<
11081 1726776687.24937: stdout chunk (state=3): >>><<<
11081 1726776687.24952: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776687.2207417-11081-204455755959771=/root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771
, stderr=
11081 1726776687.24977: variable 'ansible_module_compression' from source: unknown
11081 1726776687.25022: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
11081 1726776687.25056: variable 'ansible_facts' from source: unknown
11081 1726776687.25135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/AnsiballZ_command.py
11081 1726776687.25582: Sending initial data
11081 1726776687.25590: Sent initial data (155 bytes)
11081 1726776687.28504: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpzy73w831 /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/AnsiballZ_command.py
<<<
11081 1726776687.30065: stderr chunk (state=3): >>><<<
11081 1726776687.30077: stdout chunk (state=3): >>><<<
11081 1726776687.30101: done transferring module to remote
11081 1726776687.30114: _low_level_execute_command(): starting
11081 1726776687.30120: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/ /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/AnsiballZ_command.py && sleep 0'
11081 1726776687.32674: stderr chunk (state=2): >>><<<
11081 1726776687.32685: stdout chunk (state=2): >>><<<
11081 1726776687.32701: _low_level_execute_command() done: rc=0, stdout=, stderr=
11081 1726776687.32706: _low_level_execute_command(): starting
11081 1726776687.32711: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/AnsiballZ_command.py && sleep 0'
11081 1726776687.59662: stdout chunk (state=2): >>>
{"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:27.489400", "end": "2024-09-19 16:11:27.594803", "delta": "0:00:00.105403", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
<<<
11081 1726776687.60853: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed.
<<<
11081 1726776687.60901: stderr chunk (state=3): >>><<<
11081 1726776687.60908: stdout chunk (state=3): >>><<<
11081 1726776687.60925: _low_level_execute_command() done: rc=0, stdout=
{"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:27.489400", "end": "2024-09-19 16:11:27.594803", "delta": "0:00:00.105403", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=Shared connection to 10.31.12.75 closed.
11081 1726776687.60968: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/', '_ansible_remote_tmp': '~/.ansible/tmp'})
11081 1726776687.60979: _low_level_execute_command(): starting
11081 1726776687.60985: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776687.2207417-11081-204455755959771/ > /dev/null 2>&1 && sleep 0'
11081 1726776687.63418: stderr chunk (state=2): >>><<<
11081 1726776687.63425: stdout chunk (state=2): >>><<<
11081 1726776687.63440: _low_level_execute_command() done: rc=0, stdout=, stderr=
11081 1726776687.63446: handler run complete
11081 1726776687.63464: Evaluated conditional (False): False
11081 1726776687.63474: attempt loop complete, returning result
11081 1726776687.63478: _execute() done
11081 1726776687.63481: dumping result to json
11081 1726776687.63487: done dumping result, returning
11081 1726776687.63494: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-cec2-986e-0000000007f9]
11081 1726776687.63500: sending task result for task 120fa90a-8a95-cec2-986e-0000000007f9
11081 1726776687.63536: done sending task result for task 120fa90a-8a95-cec2-986e-0000000007f9
11081 1726776687.63540: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": [
        "tuned-adm",
        "verify",
        "-i"
    ],
    "delta": "0:00:00.105403",
    "end": "2024-09-19 16:11:27.594803",
    "rc": 0,
    "start": "2024-09-19 16:11:27.489400"
}

STDOUT:

Verification succeeded, current system settings match the preset profile.
See TuneD log file ('/var/log/tuned/tuned.log') for details.
8218 1726776687.63702: no more pending results, returning what we have
8218 1726776687.63705: results queue empty
8218 1726776687.63705: checking for any_errors_fatal
8218 1726776687.63707: done checking for any_errors_fatal
8218 1726776687.63707: checking for max_fail_percentage
8218 1726776687.63709: done checking for max_fail_percentage
8218 1726776687.63709: checking to see if all hosts have failed and the running result is not ok
8218 1726776687.63710: done checking to see if all hosts have failed
8218 1726776687.63711: getting the remaining hosts for this loop
8218 1726776687.63711: done getting the remaining hosts for this loop
8218 1726776687.63714: getting the next task for host managed_node2
8218 1726776687.63720: done getting next task for host managed_node2
8218 1726776687.63722: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log
8218 1726776687.63726: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776687.63737: getting variables
8218 1726776687.63738: in VariableManager get_vars()
8218 1726776687.63773: Calling all_inventory to load vars for managed_node2
8218 1726776687.63776: Calling groups_inventory to load vars for managed_node2
8218 1726776687.63777: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776687.63784: Calling all_plugins_play to load vars for managed_node2
8218 1726776687.63786: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776687.63787: Calling groups_plugins_play to load vars for managed_node2
8218 1726776687.63902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776687.64052: done with get_vars()
8218 1726776687.64063: done getting variables
8218 1726776687.64104: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12
Thursday 19 September 2024 16:11:27 -0400 (0:00:00.459) 0:01:13.471 ****
8218 1726776687.64131: entering _queue_task() for managed_node2/shell
8218 1726776687.64296: worker is 1 (out of 1 available)
8218 1726776687.64310: exiting _queue_task() for managed_node2/shell
8218 1726776687.64321: done queuing things up, now waiting for results queue to drain
8218 1726776687.64323: waiting for pending results...
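For reference, the task at verify_settings.yml:2 that produced the verify result above looks roughly like this. This is a sketch reconstructed from the log, not the role's verbatim source; `changed_when: false` is an inference from the module reporting `"changed": true` while the final task status is `ok`:

```yaml
- name: Check that settings are applied correctly
  command: tuned-adm verify -i
  register: __kernel_settings_register_verify_values
  changed_when: false
```

The registered variable name and the `tuned-adm verify -i` command are taken directly from the log entries above.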
11128 1726776687.64448: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log
11128 1726776687.64572: in run() - task 120fa90a-8a95-cec2-986e-0000000007fa
11128 1726776687.64588: variable 'ansible_search_path' from source: unknown
11128 1726776687.64592: variable 'ansible_search_path' from source: unknown
11128 1726776687.64620: calling self._execute()
11128 1726776687.64685: variable 'ansible_host' from source: host vars for 'managed_node2'
11128 1726776687.64693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11128 1726776687.64702: variable 'omit' from source: magic vars
11128 1726776687.65075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11128 1726776687.65297: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11128 1726776687.65346: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11128 1726776687.65379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11128 1726776687.65406: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11128 1726776687.65512: variable '__kernel_settings_register_verify_values' from source: set_fact
11128 1726776687.65533: Evaluated conditional (__kernel_settings_register_verify_values is failed): False
11128 1726776687.65538: when evaluation is False, skipping this task
11128 1726776687.65542: _execute() done
11128 1726776687.65546: dumping result to json
11128 1726776687.65550: done dumping result, returning
11128 1726776687.65555: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-cec2-986e-0000000007fa]
11128 1726776687.65562: sending task result for task 120fa90a-8a95-cec2-986e-0000000007fa
11128 1726776687.65584: done sending task result for task 120fa90a-8a95-cec2-986e-0000000007fa
11128 1726776687.65587: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_verify_values is failed",
    "skip_reason": "Conditional result was False"
}
8218 1726776687.65689: no more pending results, returning what we have
8218 1726776687.65692: results queue empty
8218 1726776687.65693: checking for any_errors_fatal
8218 1726776687.65700: done checking for any_errors_fatal
8218 1726776687.65701: checking for max_fail_percentage
8218 1726776687.65702: done checking for max_fail_percentage
8218 1726776687.65703: checking to see if all hosts have failed and the running result is not ok
8218 1726776687.65703: done checking to see if all hosts have failed
8218 1726776687.65704: getting the remaining hosts for this loop
8218 1726776687.65705: done getting the remaining hosts for this loop
8218 1726776687.65708: getting the next task for host managed_node2
8218 1726776687.65713: done getting next task for host managed_node2
8218 1726776687.65716: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors
8218 1726776687.65719: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776687.65734: getting variables
8218 1726776687.65735: in VariableManager get_vars()
8218 1726776687.65765: Calling all_inventory to load vars for managed_node2
8218 1726776687.65767: Calling groups_inventory to load vars for managed_node2
8218 1726776687.65768: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776687.65774: Calling all_plugins_play to load vars for managed_node2
8218 1726776687.65776: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776687.65778: Calling groups_plugins_play to load vars for managed_node2
8218 1726776687.65880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776687.66001: done with get_vars()
8218 1726776687.66009: done getting variables
8218 1726776687.66050: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23
Thursday 19 September 2024 16:11:27 -0400 (0:00:00.019) 0:01:13.491 ****
8218 1726776687.66075: entering _queue_task() for managed_node2/fail
8218 1726776687.66216: worker is 1 (out of 1 available)
8218 1726776687.66231: exiting _queue_task() for managed_node2/fail
8218 1726776687.66243: done queuing things up, now waiting for results queue to drain
8218 1726776687.66245: waiting for pending results...
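The skip above and the "Verify settings" include earlier hinge on the `changed`/`failed` Jinja tests applied to registered results. A rough emulation of how those tests read a result dict (simplified; the real tests live in ansible/plugins/test/core.py and handle more edge cases, so these helpers are illustrative stand-ins only):

```python
def is_changed(result):
    # simplified stand-in for Ansible's `changed` test
    return bool(result.get("changed", False))

def is_failed(result):
    # simplified stand-in for Ansible's `failed` test
    return bool(result.get("failed", False))

# Registered results, abbreviated from the log above.
apply_result = {"changed": True, "rc": 0}    # __kernel_settings_register_apply
verify_result = {"changed": False, "rc": 0}  # __kernel_settings_register_verify_values

print(is_changed(apply_result))   # True  -> the "Verify settings" include ran
print(is_failed(verify_result))   # False -> the log-scrape and fail tasks were skipped
```

This matches the log: `Evaluated conditional (__kernel_settings_register_apply is changed): True` and `Evaluated conditional (__kernel_settings_register_verify_values is failed): False`.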
11129 1726776687.66363: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11129 1726776687.66483: in run() - task 120fa90a-8a95-cec2-986e-0000000007fb 11129 1726776687.66498: variable 'ansible_search_path' from source: unknown 11129 1726776687.66502: variable 'ansible_search_path' from source: unknown 11129 1726776687.66526: calling self._execute() 11129 1726776687.66590: variable 'ansible_host' from source: host vars for 'managed_node2' 11129 1726776687.66596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11129 1726776687.66602: variable 'omit' from source: magic vars 11129 1726776687.66910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11129 1726776687.67137: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11129 1726776687.67170: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11129 1726776687.67196: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11129 1726776687.67222: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11129 1726776687.67304: variable '__kernel_settings_register_verify_values' from source: set_fact 11129 1726776687.67321: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11129 1726776687.67324: when evaluation is False, skipping this task 11129 1726776687.67326: _execute() done 11129 1726776687.67330: dumping result to json 11129 1726776687.67333: done dumping result, returning 11129 1726776687.67337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-cec2-986e-0000000007fb] 11129 1726776687.67342: sending task result for task 
120fa90a-8a95-cec2-986e-0000000007fb 11129 1726776687.67361: done sending task result for task 120fa90a-8a95-cec2-986e-0000000007fb 11129 1726776687.67363: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8218 1726776687.67574: no more pending results, returning what we have 8218 1726776687.67577: results queue empty 8218 1726776687.67577: checking for any_errors_fatal 8218 1726776687.67581: done checking for any_errors_fatal 8218 1726776687.67582: checking for max_fail_percentage 8218 1726776687.67583: done checking for max_fail_percentage 8218 1726776687.67583: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.67584: done checking to see if all hosts have failed 8218 1726776687.67584: getting the remaining hosts for this loop 8218 1726776687.67585: done getting the remaining hosts for this loop 8218 1726776687.67587: getting the next task for host managed_node2 8218 1726776687.67591: done getting next task for host managed_node2 8218 1726776687.67593: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8218 1726776687.67595: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776687.67605: getting variables 8218 1726776687.67606: in VariableManager get_vars() 8218 1726776687.67627: Calling all_inventory to load vars for managed_node2 8218 1726776687.67631: Calling groups_inventory to load vars for managed_node2 8218 1726776687.67632: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.67638: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.67639: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.67641: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.67744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.67897: done with get_vars() 8218 1726776687.67904: done getting variables 8218 1726776687.67945: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 16:11:27 -0400 (0:00:00.018) 0:01:13.510 **** 8218 1726776687.67968: entering _queue_task() for managed_node2/set_fact 8218 1726776687.68107: worker is 1 (out of 1 available) 8218 1726776687.68121: exiting _queue_task() for managed_node2/set_fact 8218 1726776687.68134: done queuing things up, now waiting for results queue to drain 8218 1726776687.68135: waiting for pending results... 
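[Editor's note: the next task in the trace, "Set the flag that reboot is needed to apply changes" (main.yml:177), is a `set_fact` whose result reports `kernel_settings_reboot_required: false`. The expression producing that value is not visible in the log; a minimal sketch consistent with the reported fact:]

```yaml
- name: Set the flag that reboot is needed to apply changes
  set_fact:
    # In the actual role this is likely computed from the apply results;
    # the log only shows the resulting value (false) for this run.
    kernel_settings_reboot_required: false
```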
11130 1726776687.68253: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11130 1726776687.68358: in run() - task 120fa90a-8a95-cec2-986e-000000000623 11130 1726776687.68373: variable 'ansible_search_path' from source: unknown 11130 1726776687.68377: variable 'ansible_search_path' from source: unknown 11130 1726776687.68402: calling self._execute() 11130 1726776687.68465: variable 'ansible_host' from source: host vars for 'managed_node2' 11130 1726776687.68475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11130 1726776687.68483: variable 'omit' from source: magic vars 11130 1726776687.68554: variable 'omit' from source: magic vars 11130 1726776687.68590: variable 'omit' from source: magic vars 11130 1726776687.68612: variable 'omit' from source: magic vars 11130 1726776687.68644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11130 1726776687.68670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11130 1726776687.68689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11130 1726776687.68703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11130 1726776687.68713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11130 1726776687.68737: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11130 1726776687.68742: variable 'ansible_host' from source: host vars for 'managed_node2' 11130 1726776687.68747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11130 1726776687.68813: Set connection var ansible_connection to ssh 11130 1726776687.68821: Set connection var ansible_pipelining to 
False 11130 1726776687.68828: Set connection var ansible_timeout to 10 11130 1726776687.68836: Set connection var ansible_module_compression to ZIP_DEFLATED 11130 1726776687.68842: Set connection var ansible_shell_type to sh 11130 1726776687.68847: Set connection var ansible_shell_executable to /bin/sh 11130 1726776687.68862: variable 'ansible_shell_executable' from source: unknown 11130 1726776687.68866: variable 'ansible_connection' from source: unknown 11130 1726776687.68871: variable 'ansible_module_compression' from source: unknown 11130 1726776687.68874: variable 'ansible_shell_type' from source: unknown 11130 1726776687.68877: variable 'ansible_shell_executable' from source: unknown 11130 1726776687.68881: variable 'ansible_host' from source: host vars for 'managed_node2' 11130 1726776687.68884: variable 'ansible_pipelining' from source: unknown 11130 1726776687.68888: variable 'ansible_timeout' from source: unknown 11130 1726776687.68892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11130 1726776687.68981: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11130 1726776687.68994: variable 'omit' from source: magic vars 11130 1726776687.69000: starting attempt loop 11130 1726776687.69004: running the handler 11130 1726776687.69014: handler run complete 11130 1726776687.69024: attempt loop complete, returning result 11130 1726776687.69027: _execute() done 11130 1726776687.69031: dumping result to json 11130 1726776687.69034: done dumping result, returning 11130 1726776687.69040: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000623] 11130 
1726776687.69046: sending task result for task 120fa90a-8a95-cec2-986e-000000000623 11130 1726776687.69066: done sending task result for task 120fa90a-8a95-cec2-986e-000000000623 11130 1726776687.69070: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8218 1726776687.69179: no more pending results, returning what we have 8218 1726776687.69182: results queue empty 8218 1726776687.69183: checking for any_errors_fatal 8218 1726776687.69187: done checking for any_errors_fatal 8218 1726776687.69188: checking for max_fail_percentage 8218 1726776687.69189: done checking for max_fail_percentage 8218 1726776687.69190: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.69191: done checking to see if all hosts have failed 8218 1726776687.69191: getting the remaining hosts for this loop 8218 1726776687.69192: done getting the remaining hosts for this loop 8218 1726776687.69195: getting the next task for host managed_node2 8218 1726776687.69200: done getting next task for host managed_node2 8218 1726776687.69202: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8218 1726776687.69204: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776687.69212: getting variables 8218 1726776687.69213: in VariableManager get_vars() 8218 1726776687.69240: Calling all_inventory to load vars for managed_node2 8218 1726776687.69242: Calling groups_inventory to load vars for managed_node2 8218 1726776687.69243: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.69248: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.69250: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.69252: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.69350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.69472: done with get_vars() 8218 1726776687.69479: done getting variables 8218 1726776687.69516: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 16:11:27 -0400 (0:00:00.015) 0:01:13.525 **** 8218 1726776687.69538: entering _queue_task() for managed_node2/set_fact 8218 1726776687.69676: worker is 1 (out of 1 available) 8218 1726776687.69688: exiting _queue_task() for managed_node2/set_fact 8218 1726776687.69699: done queuing things up, now waiting for results queue to drain 8218 1726776687.69700: waiting for pending results... 
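[Editor's note: the next task, "Set flag to indicate changed for testing" (main.yml:181), reads the registered results `__kernel_settings_register_profile`, `__kernel_settings_register_mode`, and `__kernel_settings_register_apply` and sets `__kernel_settings_changed: true`. A hedged sketch assuming the flag is an OR over those three registered results (the exact expression is not shown in the log):]

```yaml
- name: Set flag to indicate changed for testing
  set_fact:
    __kernel_settings_changed: "{{ __kernel_settings_register_profile is changed
      or __kernel_settings_register_mode is changed
      or __kernel_settings_register_apply is changed }}"
```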
11131 1726776687.69808: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11131 1726776687.69914: in run() - task 120fa90a-8a95-cec2-986e-000000000624 11131 1726776687.69931: variable 'ansible_search_path' from source: unknown 11131 1726776687.69935: variable 'ansible_search_path' from source: unknown 11131 1726776687.69959: calling self._execute() 11131 1726776687.70020: variable 'ansible_host' from source: host vars for 'managed_node2' 11131 1726776687.70030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11131 1726776687.70039: variable 'omit' from source: magic vars 11131 1726776687.70105: variable 'omit' from source: magic vars 11131 1726776687.70143: variable 'omit' from source: magic vars 11131 1726776687.70390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11131 1726776687.70616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11131 1726776687.70650: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11131 1726776687.70676: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11131 1726776687.70703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11131 1726776687.70802: variable '__kernel_settings_register_profile' from source: set_fact 11131 1726776687.70815: variable '__kernel_settings_register_mode' from source: set_fact 11131 1726776687.70823: variable '__kernel_settings_register_apply' from source: set_fact 11131 1726776687.70860: variable 'omit' from source: magic vars 11131 1726776687.70880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11131 1726776687.70901: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11131 1726776687.70916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11131 1726776687.70927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11131 1726776687.70943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11131 1726776687.70964: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11131 1726776687.70970: variable 'ansible_host' from source: host vars for 'managed_node2' 11131 1726776687.70974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11131 1726776687.71037: Set connection var ansible_connection to ssh 11131 1726776687.71045: Set connection var ansible_pipelining to False 11131 1726776687.71051: Set connection var ansible_timeout to 10 11131 1726776687.71059: Set connection var ansible_module_compression to ZIP_DEFLATED 11131 1726776687.71065: Set connection var ansible_shell_type to sh 11131 1726776687.71070: Set connection var ansible_shell_executable to /bin/sh 11131 1726776687.71084: variable 'ansible_shell_executable' from source: unknown 11131 1726776687.71088: variable 'ansible_connection' from source: unknown 11131 1726776687.71091: variable 'ansible_module_compression' from source: unknown 11131 1726776687.71094: variable 'ansible_shell_type' from source: unknown 11131 1726776687.71098: variable 'ansible_shell_executable' from source: unknown 11131 1726776687.71101: variable 'ansible_host' from source: host vars for 'managed_node2' 11131 1726776687.71105: variable 'ansible_pipelining' from source: unknown 11131 1726776687.71108: variable 'ansible_timeout' from source: unknown 11131 1726776687.71112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11131 
1726776687.71179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11131 1726776687.71191: variable 'omit' from source: magic vars 11131 1726776687.71196: starting attempt loop 11131 1726776687.71200: running the handler 11131 1726776687.71209: handler run complete 11131 1726776687.71218: attempt loop complete, returning result 11131 1726776687.71221: _execute() done 11131 1726776687.71224: dumping result to json 11131 1726776687.71227: done dumping result, returning 11131 1726776687.71235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-cec2-986e-000000000624] 11131 1726776687.71242: sending task result for task 120fa90a-8a95-cec2-986e-000000000624 11131 1726776687.71262: done sending task result for task 120fa90a-8a95-cec2-986e-000000000624 11131 1726776687.71265: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8218 1726776687.71392: no more pending results, returning what we have 8218 1726776687.71395: results queue empty 8218 1726776687.71396: checking for any_errors_fatal 8218 1726776687.71400: done checking for any_errors_fatal 8218 1726776687.71401: checking for max_fail_percentage 8218 1726776687.71402: done checking for max_fail_percentage 8218 1726776687.71403: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.71404: done checking to see if all hosts have failed 8218 1726776687.71404: getting the remaining hosts for this loop 8218 1726776687.71405: done getting the remaining hosts for this loop 8218 1726776687.71408: getting the next task for host managed_node2 8218 1726776687.71415: 
done getting next task for host managed_node2 8218 1726776687.71417: ^ task is: TASK: meta (role_complete) 8218 1726776687.71419: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776687.71427: getting variables 8218 1726776687.71430: in VariableManager get_vars() 8218 1726776687.71453: Calling all_inventory to load vars for managed_node2 8218 1726776687.71455: Calling groups_inventory to load vars for managed_node2 8218 1726776687.71456: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.71463: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.71465: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.71467: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.71572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.71719: done with get_vars() 8218 1726776687.71725: done getting variables 8218 1726776687.71779: done queuing things up, now waiting for results queue to drain 8218 1726776687.71780: results queue empty 8218 1726776687.71780: checking for any_errors_fatal 8218 1726776687.71782: done checking for any_errors_fatal 8218 1726776687.71783: checking for max_fail_percentage 8218 1726776687.71783: done checking for max_fail_percentage 8218 1726776687.71787: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.71787: done 
checking to see if all hosts have failed 8218 1726776687.71787: getting the remaining hosts for this loop 8218 1726776687.71788: done getting the remaining hosts for this loop 8218 1726776687.71789: getting the next task for host managed_node2 8218 1726776687.71791: done getting next task for host managed_node2 8218 1726776687.71792: ^ task is: TASK: meta (flush_handlers) 8218 1726776687.71793: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776687.71796: getting variables 8218 1726776687.71796: in VariableManager get_vars() 8218 1726776687.71803: Calling all_inventory to load vars for managed_node2 8218 1726776687.71804: Calling groups_inventory to load vars for managed_node2 8218 1726776687.71805: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.71808: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.71809: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.71811: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.71888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.71989: done with get_vars() 8218 1726776687.71995: done getting variables TASK [Force handlers] ********************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:159 Thursday 19 September 2024 16:11:27 -0400 (0:00:00.024) 0:01:13.550 **** 8218 1726776687.72036: in VariableManager get_vars() 8218 1726776687.72043: Calling all_inventory to load vars for managed_node2 8218 1726776687.72044: Calling groups_inventory to load vars for managed_node2 8218 
1726776687.72045: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.72048: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.72049: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.72050: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.72126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.72230: done with get_vars() META: triggered running handlers for managed_node2 8218 1726776687.72240: done queuing things up, now waiting for results queue to drain 8218 1726776687.72241: results queue empty 8218 1726776687.72241: checking for any_errors_fatal 8218 1726776687.72242: done checking for any_errors_fatal 8218 1726776687.72243: checking for max_fail_percentage 8218 1726776687.72243: done checking for max_fail_percentage 8218 1726776687.72244: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.72244: done checking to see if all hosts have failed 8218 1726776687.72244: getting the remaining hosts for this loop 8218 1726776687.72245: done getting the remaining hosts for this loop 8218 1726776687.72246: getting the next task for host managed_node2 8218 1726776687.72248: done getting next task for host managed_node2 8218 1726776687.72249: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8218 1726776687.72250: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776687.72252: getting variables 8218 1726776687.72252: in VariableManager get_vars() 8218 1726776687.72260: Calling all_inventory to load vars for managed_node2 8218 1726776687.72261: Calling groups_inventory to load vars for managed_node2 8218 1726776687.72262: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.72265: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.72266: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.72268: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.72365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.72474: done with get_vars() 8218 1726776687.72480: done getting variables 8218 1726776687.72503: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:162 Thursday 19 September 2024 16:11:27 -0400 (0:00:00.004) 0:01:13.555 **** 8218 1726776687.72516: entering _queue_task() for managed_node2/assert 8218 1726776687.72665: worker is 1 (out of 1 available) 8218 1726776687.72678: exiting _queue_task() for managed_node2/assert 8218 1726776687.72691: done queuing things up, now waiting for results queue to drain 8218 1726776687.72693: waiting for pending results... 
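[Editor's note: the assert task queued above, "Ensure kernel_settings_reboot_required is not set or is false" (tests_change_settings.yml:162), evaluates the conditional `not kernel_settings_reboot_required | d(false)`, which the trace below reports as True. A sketch of the test task built directly from that logged conditional — the `fail_msg` wording is an assumption:]

```yaml
- name: Ensure kernel_settings_reboot_required is not set or is false
  assert:
    that: not kernel_settings_reboot_required | d(false)
    fail_msg: "reboot_required should not be set after applying settings"  # assumed
```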
11132 1726776687.72809: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 11132 1726776687.72905: in run() - task 120fa90a-8a95-cec2-986e-000000000021 11132 1726776687.72920: variable 'ansible_search_path' from source: unknown 11132 1726776687.72949: calling self._execute() 11132 1726776687.73014: variable 'ansible_host' from source: host vars for 'managed_node2' 11132 1726776687.73023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11132 1726776687.73033: variable 'omit' from source: magic vars 11132 1726776687.73106: variable 'omit' from source: magic vars 11132 1726776687.73135: variable 'omit' from source: magic vars 11132 1726776687.73159: variable 'omit' from source: magic vars 11132 1726776687.73191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11132 1726776687.73218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11132 1726776687.73238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11132 1726776687.73253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11132 1726776687.73265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11132 1726776687.73287: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11132 1726776687.73293: variable 'ansible_host' from source: host vars for 'managed_node2' 11132 1726776687.73297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11132 1726776687.73364: Set connection var ansible_connection to ssh 11132 1726776687.73372: Set connection var ansible_pipelining to False 11132 1726776687.73378: Set connection var ansible_timeout to 10 11132 1726776687.73385: Set connection 
var ansible_module_compression to ZIP_DEFLATED 11132 1726776687.73390: Set connection var ansible_shell_type to sh 11132 1726776687.73395: Set connection var ansible_shell_executable to /bin/sh 11132 1726776687.73410: variable 'ansible_shell_executable' from source: unknown 11132 1726776687.73415: variable 'ansible_connection' from source: unknown 11132 1726776687.73419: variable 'ansible_module_compression' from source: unknown 11132 1726776687.73422: variable 'ansible_shell_type' from source: unknown 11132 1726776687.73426: variable 'ansible_shell_executable' from source: unknown 11132 1726776687.73430: variable 'ansible_host' from source: host vars for 'managed_node2' 11132 1726776687.73435: variable 'ansible_pipelining' from source: unknown 11132 1726776687.73438: variable 'ansible_timeout' from source: unknown 11132 1726776687.73442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11132 1726776687.73530: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11132 1726776687.73542: variable 'omit' from source: magic vars 11132 1726776687.73549: starting attempt loop 11132 1726776687.73552: running the handler 11132 1726776687.73798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11132 1726776687.75325: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11132 1726776687.75371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11132 1726776687.75398: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11132 1726776687.75426: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11132 1726776687.75448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11132 1726776687.75495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11132 1726776687.75515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11132 1726776687.75536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11132 1726776687.75565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11132 1726776687.75577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11132 1726776687.75659: variable 'kernel_settings_reboot_required' from source: set_fact 11132 1726776687.75675: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 11132 1726776687.75682: handler run complete 11132 1726776687.75698: attempt loop complete, returning result 11132 1726776687.75702: _execute() done 11132 1726776687.75704: dumping result to json 11132 1726776687.75708: done dumping result, returning 11132 1726776687.75715: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [120fa90a-8a95-cec2-986e-000000000021] 11132 1726776687.75720: sending task result for 
task 120fa90a-8a95-cec2-986e-000000000021 11132 1726776687.75744: done sending task result for task 120fa90a-8a95-cec2-986e-000000000021 11132 1726776687.75748: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776687.75887: no more pending results, returning what we have 8218 1726776687.75890: results queue empty 8218 1726776687.75890: checking for any_errors_fatal 8218 1726776687.75892: done checking for any_errors_fatal 8218 1726776687.75893: checking for max_fail_percentage 8218 1726776687.75895: done checking for max_fail_percentage 8218 1726776687.75895: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.75896: done checking to see if all hosts have failed 8218 1726776687.75896: getting the remaining hosts for this loop 8218 1726776687.75897: done getting the remaining hosts for this loop 8218 1726776687.75900: getting the next task for host managed_node2 8218 1726776687.75905: done getting next task for host managed_node2 8218 1726776687.75907: ^ task is: TASK: Ensure role reported changed 8218 1726776687.75909: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776687.75911: getting variables 8218 1726776687.75913: in VariableManager get_vars() 8218 1726776687.75945: Calling all_inventory to load vars for managed_node2 8218 1726776687.75947: Calling groups_inventory to load vars for managed_node2 8218 1726776687.75949: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.75960: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.75967: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.75969: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.76076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.76196: done with get_vars() 8218 1726776687.76204: done getting variables 8218 1726776687.76247: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:166 Thursday 19 September 2024 16:11:27 -0400 (0:00:00.037) 0:01:13.593 **** 8218 1726776687.76268: entering _queue_task() for managed_node2/assert 8218 1726776687.76414: worker is 1 (out of 1 available) 8218 1726776687.76430: exiting _queue_task() for managed_node2/assert 8218 1726776687.76442: done queuing things up, now waiting for results queue to drain 8218 1726776687.76444: waiting for pending results... 
11133 1726776687.76564: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 11133 1726776687.76658: in run() - task 120fa90a-8a95-cec2-986e-000000000022 11133 1726776687.76675: variable 'ansible_search_path' from source: unknown 11133 1726776687.76702: calling self._execute() 11133 1726776687.76768: variable 'ansible_host' from source: host vars for 'managed_node2' 11133 1726776687.76839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11133 1726776687.76850: variable 'omit' from source: magic vars 11133 1726776687.76922: variable 'omit' from source: magic vars 11133 1726776687.76947: variable 'omit' from source: magic vars 11133 1726776687.76969: variable 'omit' from source: magic vars 11133 1726776687.77000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11133 1726776687.77025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11133 1726776687.77044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11133 1726776687.77054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11133 1726776687.77063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11133 1726776687.77083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11133 1726776687.77086: variable 'ansible_host' from source: host vars for 'managed_node2' 11133 1726776687.77089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11133 1726776687.77152: Set connection var ansible_connection to ssh 11133 1726776687.77159: Set connection var ansible_pipelining to False 11133 1726776687.77163: Set connection var ansible_timeout to 10 11133 1726776687.77168: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11133 1726776687.77171: Set connection var ansible_shell_type to sh 11133 1726776687.77174: Set connection var ansible_shell_executable to /bin/sh 11133 1726776687.77186: variable 'ansible_shell_executable' from source: unknown 11133 1726776687.77189: variable 'ansible_connection' from source: unknown 11133 1726776687.77190: variable 'ansible_module_compression' from source: unknown 11133 1726776687.77192: variable 'ansible_shell_type' from source: unknown 11133 1726776687.77194: variable 'ansible_shell_executable' from source: unknown 11133 1726776687.77196: variable 'ansible_host' from source: host vars for 'managed_node2' 11133 1726776687.77198: variable 'ansible_pipelining' from source: unknown 11133 1726776687.77199: variable 'ansible_timeout' from source: unknown 11133 1726776687.77201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11133 1726776687.77286: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11133 1726776687.77296: variable 'omit' from source: magic vars 11133 1726776687.77300: starting attempt loop 11133 1726776687.77302: running the handler 11133 1726776687.77548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11133 1726776687.79078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11133 1726776687.79133: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11133 1726776687.79164: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11133 1726776687.79191: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11133 1726776687.79210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11133 1726776687.79262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11133 1726776687.79282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11133 1726776687.79300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11133 1726776687.79326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11133 1726776687.79339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11133 1726776687.79414: variable '__kernel_settings_changed' from source: set_fact 11133 1726776687.79432: Evaluated conditional (__kernel_settings_changed | d(false)): True 11133 1726776687.79439: handler run complete 11133 1726776687.79461: attempt loop complete, returning result 11133 1726776687.79466: _execute() done 11133 1726776687.79469: dumping result to json 11133 1726776687.79472: done dumping result, returning 11133 1726776687.79478: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [120fa90a-8a95-cec2-986e-000000000022] 11133 1726776687.79485: sending task result for task 120fa90a-8a95-cec2-986e-000000000022 11133 
1726776687.79518: done sending task result for task 120fa90a-8a95-cec2-986e-000000000022 11133 1726776687.79522: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776687.79659: no more pending results, returning what we have 8218 1726776687.79662: results queue empty 8218 1726776687.79663: checking for any_errors_fatal 8218 1726776687.79669: done checking for any_errors_fatal 8218 1726776687.79670: checking for max_fail_percentage 8218 1726776687.79671: done checking for max_fail_percentage 8218 1726776687.79672: checking to see if all hosts have failed and the running result is not ok 8218 1726776687.79673: done checking to see if all hosts have failed 8218 1726776687.79673: getting the remaining hosts for this loop 8218 1726776687.79674: done getting the remaining hosts for this loop 8218 1726776687.79677: getting the next task for host managed_node2 8218 1726776687.79681: done getting next task for host managed_node2 8218 1726776687.79683: ^ task is: TASK: Check sysctl 8218 1726776687.79685: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776687.79687: getting variables 8218 1726776687.79689: in VariableManager get_vars() 8218 1726776687.79784: Calling all_inventory to load vars for managed_node2 8218 1726776687.79787: Calling groups_inventory to load vars for managed_node2 8218 1726776687.79789: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776687.79797: Calling all_plugins_play to load vars for managed_node2 8218 1726776687.79799: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776687.79807: Calling groups_plugins_play to load vars for managed_node2 8218 1726776687.79957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776687.80141: done with get_vars() 8218 1726776687.80152: done getting variables 8218 1726776687.80209: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysctl] ************************************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:170 Thursday 19 September 2024 16:11:27 -0400 (0:00:00.039) 0:01:13.632 **** 8218 1726776687.80240: entering _queue_task() for managed_node2/shell 8218 1726776687.80436: worker is 1 (out of 1 available) 8218 1726776687.80449: exiting _queue_task() for managed_node2/shell 8218 1726776687.80460: done queuing things up, now waiting for results queue to drain 8218 1726776687.80462: waiting for pending results... 
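The Check sysctl task queued here wraps its pipeline in `set -euo pipefail` (visible in its command string later in the log). A quick local sketch of why `pipefail` matters for these assertions: without it, a pipeline's exit status is that of its last stage, so a failing `sysctl` hidden behind `| grep` would go unnoticed. (This is a general shell illustration, not taken from the test playbook itself.)

```shell
# Without pipefail, `false | cat` succeeds because cat exits 0:
sh -c 'false | cat'; echo "plain: $?"
# With pipefail (a bash/ksh option, not POSIX sh), the pipeline
# propagates the failure of any stage:
bash -c 'set -o pipefail; false | cat'; echo "pipefail: $?"
```

On RHEL-family hosts like the managed nodes above, `/bin/sh` is bash, which is why `set -euo pipefail` is accepted there.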
11136 1726776687.80674: running TaskExecutor() for managed_node2/TASK: Check sysctl 11136 1726776687.80783: in run() - task 120fa90a-8a95-cec2-986e-000000000023 11136 1726776687.80798: variable 'ansible_search_path' from source: unknown 11136 1726776687.80826: calling self._execute() 11136 1726776687.80894: variable 'ansible_host' from source: host vars for 'managed_node2' 11136 1726776687.80903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11136 1726776687.80916: variable 'omit' from source: magic vars 11136 1726776687.80998: variable 'omit' from source: magic vars 11136 1726776687.81025: variable 'omit' from source: magic vars 11136 1726776687.81047: variable 'omit' from source: magic vars 11136 1726776687.81076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11136 1726776687.81099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11136 1726776687.81116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11136 1726776687.81130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11136 1726776687.81140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11136 1726776687.81163: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11136 1726776687.81167: variable 'ansible_host' from source: host vars for 'managed_node2' 11136 1726776687.81170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11136 1726776687.81233: Set connection var ansible_connection to ssh 11136 1726776687.81239: Set connection var ansible_pipelining to False 11136 1726776687.81243: Set connection var ansible_timeout to 10 11136 1726776687.81247: Set connection var ansible_module_compression to ZIP_DEFLATED 
11136 1726776687.81252: Set connection var ansible_shell_type to sh 11136 1726776687.81255: Set connection var ansible_shell_executable to /bin/sh 11136 1726776687.81269: variable 'ansible_shell_executable' from source: unknown 11136 1726776687.81272: variable 'ansible_connection' from source: unknown 11136 1726776687.81274: variable 'ansible_module_compression' from source: unknown 11136 1726776687.81276: variable 'ansible_shell_type' from source: unknown 11136 1726776687.81277: variable 'ansible_shell_executable' from source: unknown 11136 1726776687.81279: variable 'ansible_host' from source: host vars for 'managed_node2' 11136 1726776687.81281: variable 'ansible_pipelining' from source: unknown 11136 1726776687.81283: variable 'ansible_timeout' from source: unknown 11136 1726776687.81285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11136 1726776687.81386: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11136 1726776687.81399: variable 'omit' from source: magic vars 11136 1726776687.81405: starting attempt loop 11136 1726776687.81409: running the handler 11136 1726776687.81419: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11136 1726776687.81437: _low_level_execute_command(): starting 11136 1726776687.81446: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11136 1726776687.83781: stdout chunk (state=2): >>>/root <<< 11136 1726776687.83918: stderr chunk (state=3): >>><<< 11136 1726776687.83925: stdout 
chunk (state=3): >>><<< 11136 1726776687.83947: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11136 1726776687.83964: _low_level_execute_command(): starting 11136 1726776687.83970: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867 `" && echo ansible-tmp-1726776687.8395584-11136-150247608775867="` echo /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867 `" ) && sleep 0' 11136 1726776687.86707: stdout chunk (state=2): >>>ansible-tmp-1726776687.8395584-11136-150247608775867=/root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867 <<< 11136 1726776687.86835: stderr chunk (state=3): >>><<< 11136 1726776687.86842: stdout chunk (state=3): >>><<< 11136 1726776687.86857: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776687.8395584-11136-150247608775867=/root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867 , stderr= 11136 1726776687.86884: variable 'ansible_module_compression' from source: unknown 11136 1726776687.86926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11136 1726776687.86962: variable 'ansible_facts' from source: unknown 11136 1726776687.87038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/AnsiballZ_command.py 11136 1726776687.87137: Sending initial data 11136 1726776687.87144: Sent initial data (155 bytes) 11136 1726776687.89604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpxp_eh9x0 /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/AnsiballZ_command.py <<< 11136 1726776687.90674: stderr chunk (state=3): >>><<< 11136 1726776687.90681: stdout chunk (state=3): >>><<< 11136 1726776687.90698: done transferring module to remote 11136 
1726776687.90708: _low_level_execute_command(): starting 11136 1726776687.90713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/ /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/AnsiballZ_command.py && sleep 0' 11136 1726776687.93031: stderr chunk (state=2): >>><<< 11136 1726776687.93040: stdout chunk (state=2): >>><<< 11136 1726776687.93054: _low_level_execute_command() done: rc=0, stdout=, stderr= 11136 1726776687.93060: _low_level_execute_command(): starting 11136 1726776687.93066: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/AnsiballZ_command.py && sleep 0' 11136 1726776688.08836: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "start": "2024-09-19 16:11:28.080307", "end": "2024-09-19 16:11:28.086740", "delta": "0:00:00.006433", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11136 1726776688.09987: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11136 1726776688.10037: stderr chunk (state=3): >>><<< 11136 1726776688.10044: stdout chunk (state=3): >>><<< 11136 1726776688.10061: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "start": "2024-09-19 16:11:28.080307", "end": "2024-09-19 16:11:28.086740", "delta": "0:00:00.006433", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11136 1726776688.10102: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11136 1726776688.10114: _low_level_execute_command(): starting 11136 1726776688.10121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776687.8395584-11136-150247608775867/ > /dev/null 2>&1 && sleep 0' 11136 1726776688.12495: stderr chunk (state=2): >>><<< 11136 1726776688.12503: stdout chunk (state=2): >>><<< 11136 1726776688.12516: _low_level_execute_command() done: 
rc=0, stdout=, stderr= 11136 1726776688.12523: handler run complete 11136 1726776688.12542: Evaluated conditional (False): False 11136 1726776688.12551: attempt loop complete, returning result 11136 1726776688.12555: _execute() done 11136 1726776688.12558: dumping result to json 11136 1726776688.12563: done dumping result, returning 11136 1726776688.12570: done running TaskExecutor() for managed_node2/TASK: Check sysctl [120fa90a-8a95-cec2-986e-000000000023] 11136 1726776688.12577: sending task result for task 120fa90a-8a95-cec2-986e-000000000023 11136 1726776688.12608: done sending task result for task 120fa90a-8a95-cec2-986e-000000000023 11136 1726776688.12612: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "delta": "0:00:00.006433", "end": "2024-09-19 16:11:28.086740", "rc": 0, "start": "2024-09-19 16:11:28.080307" } 8218 1726776688.12745: no more pending results, returning what we have 8218 1726776688.12748: results queue empty 8218 1726776688.12749: checking for any_errors_fatal 8218 1726776688.12754: done checking for any_errors_fatal 8218 1726776688.12754: checking for max_fail_percentage 8218 1726776688.12756: done checking for max_fail_percentage 8218 1726776688.12756: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.12759: done checking to see if all hosts have failed 8218 1726776688.12760: getting the remaining hosts for this loop 8218 1726776688.12761: done getting the remaining hosts for this loop 8218 1726776688.12764: getting the next task for host managed_node2 8218 1726776688.12769: done getting next task for host managed_node2 8218 1726776688.12771: ^ task is: TASK: Check sysfs after role runs 8218 1726776688.12773: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776688.12776: getting variables 8218 1726776688.12777: in VariableManager get_vars() 8218 1726776688.12808: Calling all_inventory to load vars for managed_node2 8218 1726776688.12811: Calling groups_inventory to load vars for managed_node2 8218 1726776688.12813: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.12821: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.12824: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.12826: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.12944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.13085: done with get_vars() 8218 1726776688.13093: done getting variables 8218 1726776688.13136: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:176 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.329) 0:01:13.962 **** 8218 1726776688.13159: entering _queue_task() for managed_node2/command 8218 1726776688.13315: worker is 1 (out of 1 available) 8218 1726776688.13331: exiting _queue_task() for managed_node2/command 8218 1726776688.13343: done queuing things up, now waiting for results queue to drain 8218 1726776688.13344: waiting for pending results... 
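The Check sysctl result above hinges on grep's flags: `-x` requires the whole line to match and `-q` suppresses output, so only the exit status carries the verdict, while `sysctl -n` prints just the value. A minimal demonstration of that exact-match behavior (using the value 400001 from the log):

```shell
# -x: the entire line must equal the pattern; -q: print nothing, only
# set the exit status. This is how the task verifies fs.file-max.
printf '400001\n'  | grep -qx 400001 && echo exact     # whole-line match
printf '4000011\n' | grep -qx 400001 || echo no-match  # superstring rejected
```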
11155 1726776688.13466: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 11155 1726776688.13567: in run() - task 120fa90a-8a95-cec2-986e-000000000024 11155 1726776688.13583: variable 'ansible_search_path' from source: unknown 11155 1726776688.13612: calling self._execute() 11155 1726776688.13679: variable 'ansible_host' from source: host vars for 'managed_node2' 11155 1726776688.13687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11155 1726776688.13697: variable 'omit' from source: magic vars 11155 1726776688.13779: variable 'omit' from source: magic vars 11155 1726776688.13804: variable 'omit' from source: magic vars 11155 1726776688.13824: variable 'omit' from source: magic vars 11155 1726776688.13856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11155 1726776688.13881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11155 1726776688.13898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11155 1726776688.13910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11155 1726776688.13919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11155 1726776688.13943: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11155 1726776688.13948: variable 'ansible_host' from source: host vars for 'managed_node2' 11155 1726776688.13950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11155 1726776688.14014: Set connection var ansible_connection to ssh 11155 1726776688.14020: Set connection var ansible_pipelining to False 11155 1726776688.14024: Set connection var ansible_timeout to 10 11155 1726776688.14030: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11155 1726776688.14033: Set connection var ansible_shell_type to sh 11155 1726776688.14036: Set connection var ansible_shell_executable to /bin/sh 11155 1726776688.14049: variable 'ansible_shell_executable' from source: unknown 11155 1726776688.14053: variable 'ansible_connection' from source: unknown 11155 1726776688.14054: variable 'ansible_module_compression' from source: unknown 11155 1726776688.14056: variable 'ansible_shell_type' from source: unknown 11155 1726776688.14058: variable 'ansible_shell_executable' from source: unknown 11155 1726776688.14060: variable 'ansible_host' from source: host vars for 'managed_node2' 11155 1726776688.14063: variable 'ansible_pipelining' from source: unknown 11155 1726776688.14064: variable 'ansible_timeout' from source: unknown 11155 1726776688.14066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11155 1726776688.14173: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11155 1726776688.14186: variable 'omit' from source: magic vars 11155 1726776688.14192: starting attempt loop 11155 1726776688.14196: running the handler 11155 1726776688.14209: _low_level_execute_command(): starting 11155 1726776688.14217: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11155 1726776688.16576: stdout chunk (state=2): >>>/root <<< 11155 1726776688.16696: stderr chunk (state=3): >>><<< 11155 1726776688.16703: stdout chunk (state=3): >>><<< 11155 1726776688.16721: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11155 1726776688.16736: _low_level_execute_command(): starting 11155 1726776688.16741: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093 `" && echo ansible-tmp-1726776688.1673048-11155-134671115796093="` echo /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093 `" ) && sleep 0' 11155 1726776688.19554: stdout chunk (state=2): >>>ansible-tmp-1726776688.1673048-11155-134671115796093=/root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093 <<< 11155 1726776688.19689: stderr chunk (state=3): >>><<< 11155 1726776688.19697: stdout chunk (state=3): >>><<< 11155 1726776688.19713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776688.1673048-11155-134671115796093=/root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093 , stderr= 11155 1726776688.19741: variable 'ansible_module_compression' from source: unknown 11155 1726776688.19785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11155 1726776688.19820: variable 'ansible_facts' from source: unknown 11155 1726776688.19894: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/AnsiballZ_command.py 11155 1726776688.19997: Sending initial data 11155 1726776688.20004: Sent initial data (155 bytes) 11155 1726776688.22532: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpxj1pghcw /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/AnsiballZ_command.py <<< 11155 1726776688.23592: stderr chunk (state=3): >>><<< 11155 1726776688.23599: stdout chunk (state=3): >>><<< 11155 1726776688.23618: done transferring module to remote 11155 1726776688.23630: _low_level_execute_command(): starting 11155 1726776688.23636: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/ /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/AnsiballZ_command.py && sleep 
0' 11155 1726776688.26001: stderr chunk (state=2): >>><<< 11155 1726776688.26009: stdout chunk (state=2): >>><<< 11155 1726776688.26026: _low_level_execute_command() done: rc=0, stdout=, stderr= 11155 1726776688.26030: _low_level_execute_command(): starting 11155 1726776688.26038: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/AnsiballZ_command.py && sleep 0' 11155 1726776688.41745: stdout chunk (state=2): >>> {"changed": true, "stdout": "60666", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:11:28.412604", "end": "2024-09-19 16:11:28.415806", "delta": "0:00:00.003202", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11155 1726776688.42876: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11155 1726776688.42923: stderr chunk (state=3): >>><<< 11155 1726776688.42931: stdout chunk (state=3): >>><<< 11155 1726776688.42948: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60666", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:11:28.412604", "end": "2024-09-19 16:11:28.415806", "delta": "0:00:00.003202", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
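Each module run above first creates a private remote tmp directory with `( umask 77 && mkdir ... )` before transferring the AnsiballZ payload over sftp. The `umask 77` yields mode 0700, so the module file is readable only by the connecting user. A minimal local sketch of that dance, with `mktemp -d` standing in for the timestamped `~/.ansible/tmp` path:

```shell
base=$(mktemp -d)                       # stand-in for /root/.ansible/tmp
# umask 77 masks group/other bits: 777 & ~077 = 700
( umask 77 && mkdir "$base/ansible-tmp-demo" )
stat -c '%a' "$base/ansible-tmp-demo"   # 700 with GNU stat on Linux
rm -rf "$base"
```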
11155 1726776688.42991: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -x 60666 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11155 1726776688.43002: _low_level_execute_command(): starting 11155 1726776688.43008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093/ > /dev/null 2>&1 && sleep 0' 11155 1726776688.45449: stderr chunk (state=2): >>><<< 11155 1726776688.45458: stdout chunk (state=2): >>><<< 11155 1726776688.45473: _low_level_execute_command() done: rc=0, stdout=, stderr= 11155 1726776688.45480: handler run complete 11155 1726776688.45497: Evaluated conditional (False): False 11155 1726776688.45507: attempt loop complete, returning result 11155 1726776688.45510: _execute() done 11155 1726776688.45513: dumping result to json 11155 1726776688.45519: done dumping result, returning 11155 1726776688.45525: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [120fa90a-8a95-cec2-986e-000000000024] 11155 1726776688.45534: sending task result for task 120fa90a-8a95-cec2-986e-000000000024 11155 1726776688.45566: done sending task result for task 120fa90a-8a95-cec2-986e-000000000024 11155 1726776688.45570: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "60666", "/sys/class/net/lo/mtu" ], "delta": 
"0:00:00.003202", "end": "2024-09-19 16:11:28.415806", "rc": 0, "start": "2024-09-19 16:11:28.412604" } STDOUT: 60666 8218 1726776688.45786: no more pending results, returning what we have 8218 1726776688.45788: results queue empty 8218 1726776688.45789: checking for any_errors_fatal 8218 1726776688.45796: done checking for any_errors_fatal 8218 1726776688.45797: checking for max_fail_percentage 8218 1726776688.45798: done checking for max_fail_percentage 8218 1726776688.45798: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.45799: done checking to see if all hosts have failed 8218 1726776688.45799: getting the remaining hosts for this loop 8218 1726776688.45800: done getting the remaining hosts for this loop 8218 1726776688.45803: getting the next task for host managed_node2 8218 1726776688.45807: done getting next task for host managed_node2 8218 1726776688.45808: ^ task is: TASK: Apply kernel_settings for removing section 8218 1726776688.45811: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776688.45815: getting variables 8218 1726776688.45816: in VariableManager get_vars() 8218 1726776688.45843: Calling all_inventory to load vars for managed_node2 8218 1726776688.45845: Calling groups_inventory to load vars for managed_node2 8218 1726776688.45847: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.45854: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.45855: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.45857: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.45967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.46083: done with get_vars() 8218 1726776688.46091: done getting variables TASK [Apply kernel_settings for removing section] ****************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:180 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.329) 0:01:14.292 **** 8218 1726776688.46161: entering _queue_task() for managed_node2/include_role 8218 1726776688.46318: worker is 1 (out of 1 available) 8218 1726776688.46335: exiting _queue_task() for managed_node2/include_role 8218 1726776688.46347: done queuing things up, now waiting for results queue to drain 8218 1726776688.46348: waiting for pending results... 
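Earlier in this chunk, `_low_level_execute_command()` created the remote tmpdir with a shell snippet that echoes a `name=path` pair (the `ansible-tmp-...=/root/.ansible/tmp/ansible-tmp-...` stdout chunk). That lets the controller learn the shell-expanded absolute path it must use for the `AnsiballZ_command.py` transfer and the later `rm -f -r` cleanup. A simplified sketch of recovering that path (not the actual ansible-core parser):

```python
# Recover the expanded remote tmpdir path from the "name=value" line the
# mkdir/echo shell snippet prints (copied from the stdout chunk in this log).
stdout = ("ansible-tmp-1726776688.1673048-11155-134671115796093="
          "/root/.ansible/tmp/ansible-tmp-1726776688.1673048-11155-134671115796093")
name, _sep, remote_tmp = stdout.partition("=")
# remote_tmp now holds the absolute path used for module transfer and cleanup.
```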
11163 1726776688.46470: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing section 11163 1726776688.46568: in run() - task 120fa90a-8a95-cec2-986e-000000000025 11163 1726776688.46584: variable 'ansible_search_path' from source: unknown 11163 1726776688.46612: calling self._execute() 11163 1726776688.46682: variable 'ansible_host' from source: host vars for 'managed_node2' 11163 1726776688.46691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11163 1726776688.46700: variable 'omit' from source: magic vars 11163 1726776688.46777: _execute() done 11163 1726776688.46782: dumping result to json 11163 1726776688.46787: done dumping result, returning 11163 1726776688.46792: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing section [120fa90a-8a95-cec2-986e-000000000025] 11163 1726776688.46800: sending task result for task 120fa90a-8a95-cec2-986e-000000000025 11163 1726776688.46828: done sending task result for task 120fa90a-8a95-cec2-986e-000000000025 11163 1726776688.46833: WORKER PROCESS EXITING 8218 1726776688.46938: no more pending results, returning what we have 8218 1726776688.46942: in VariableManager get_vars() 8218 1726776688.46977: Calling all_inventory to load vars for managed_node2 8218 1726776688.46980: Calling groups_inventory to load vars for managed_node2 8218 1726776688.46982: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.46990: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.46991: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.46993: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.47140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.47246: done with get_vars() 8218 1726776688.47250: variable 'ansible_search_path' from source: unknown 8218 1726776688.49216: variable 
'omit' from source: magic vars 8218 1726776688.49232: variable 'omit' from source: magic vars 8218 1726776688.49241: variable 'omit' from source: magic vars 8218 1726776688.49244: we have included files to process 8218 1726776688.49244: generating all_blocks data 8218 1726776688.49245: done generating all_blocks data 8218 1726776688.49247: processing included file: fedora.linux_system_roles.kernel_settings 8218 1726776688.49261: in VariableManager get_vars() 8218 1726776688.49272: done with get_vars() 8218 1726776688.49289: in VariableManager get_vars() 8218 1726776688.49299: done with get_vars() 8218 1726776688.49324: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8218 1726776688.49363: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8218 1726776688.49377: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8218 1726776688.49420: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8218 1726776688.49899: in VariableManager get_vars() 8218 1726776688.49913: done with get_vars() 8218 1726776688.50697: in VariableManager get_vars() 8218 1726776688.50710: done with get_vars() 8218 1726776688.50807: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8218 1726776688.51175: iterating over new_blocks loaded from include file 8218 1726776688.51177: in VariableManager get_vars() 8218 1726776688.51187: done with get_vars() 8218 1726776688.51188: filtering new block on tags 8218 1726776688.51210: done filtering new block on tags 8218 1726776688.51212: in VariableManager get_vars() 8218 1726776688.51238: done with get_vars() 8218 1726776688.51240: filtering new block on tags 8218 1726776688.51263: done filtering new 
block on tags 8218 1726776688.51265: in VariableManager get_vars() 8218 1726776688.51273: done with get_vars() 8218 1726776688.51274: filtering new block on tags 8218 1726776688.51347: done filtering new block on tags 8218 1726776688.51349: in VariableManager get_vars() 8218 1726776688.51358: done with get_vars() 8218 1726776688.51359: filtering new block on tags 8218 1726776688.51370: done filtering new block on tags 8218 1726776688.51371: done iterating over new_blocks loaded from include file 8218 1726776688.51371: extending task lists for all hosts with included blocks 8218 1726776688.53245: done extending task lists 8218 1726776688.53246: done processing included files 8218 1726776688.53247: results queue empty 8218 1726776688.53247: checking for any_errors_fatal 8218 1726776688.53250: done checking for any_errors_fatal 8218 1726776688.53250: checking for max_fail_percentage 8218 1726776688.53251: done checking for max_fail_percentage 8218 1726776688.53251: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.53252: done checking to see if all hosts have failed 8218 1726776688.53252: getting the remaining hosts for this loop 8218 1726776688.53253: done getting the remaining hosts for this loop 8218 1726776688.53254: getting the next task for host managed_node2 8218 1726776688.53257: done getting next task for host managed_node2 8218 1726776688.53259: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8218 1726776688.53260: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776688.53267: getting variables 8218 1726776688.53268: in VariableManager get_vars() 8218 1726776688.53277: Calling all_inventory to load vars for managed_node2 8218 1726776688.53278: Calling groups_inventory to load vars for managed_node2 8218 1726776688.53280: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.53283: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.53285: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.53286: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.53376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.53481: done with get_vars() 8218 1726776688.53488: done getting variables 8218 1726776688.53513: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.073) 0:01:14.365 **** 8218 1726776688.53535: entering _queue_task() for managed_node2/fail 8218 1726776688.53709: worker is 1 (out of 1 available) 8218 1726776688.53723: exiting _queue_task() for managed_node2/fail 8218 1726776688.53735: done queuing things up, now waiting for results queue to drain 8218 1726776688.53737: waiting for pending results... 
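The timing line printed after each `TASK [...]` banner (e.g. `(0:00:00.073) 0:01:14.365`) pairs the previous task's duration with the cumulative play time. A sketch of reading those two fields; the format is inferred from this log, not from a documented API:

```python
import re

# Parse the "(duration) cumulative" pair from a timing line like the ones
# following each TASK banner in this log. Format inferred from the output.
line = "Thursday 19 September 2024 16:11:28 -0400 (0:00:00.073) 0:01:14.365 ****"
m = re.search(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)", line)
task_duration, play_elapsed = m.groups()   # "0:00:00.073", "0:01:14.365"
```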
11164 1726776688.53867: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 11164 1726776688.53984: in run() - task 120fa90a-8a95-cec2-986e-0000000009ee 11164 1726776688.54000: variable 'ansible_search_path' from source: unknown 11164 1726776688.54004: variable 'ansible_search_path' from source: unknown 11164 1726776688.54033: calling self._execute() 11164 1726776688.54100: variable 'ansible_host' from source: host vars for 'managed_node2' 11164 1726776688.54109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11164 1726776688.54117: variable 'omit' from source: magic vars 11164 1726776688.54468: variable 'kernel_settings_sysctl' from source: include params 11164 1726776688.54482: variable '__kernel_settings_state_empty' from source: role '' all vars 11164 1726776688.54491: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): False 11164 1726776688.54495: when evaluation is False, skipping this task 11164 1726776688.54499: _execute() done 11164 1726776688.54503: dumping result to json 11164 1726776688.54506: done dumping result, returning 11164 1726776688.54511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-cec2-986e-0000000009ee] 11164 1726776688.54517: sending task result for task 120fa90a-8a95-cec2-986e-0000000009ee 11164 1726776688.54538: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009ee 11164 1726776688.54540: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "kernel_settings_sysctl != __kernel_settings_state_empty", "skip_reason": "Conditional result was False" } 8218 1726776688.54756: no more pending results, returning what we have 8218 1726776688.54758: results queue empty 8218 1726776688.54759: checking for any_errors_fatal 8218 1726776688.54760: done 
checking for any_errors_fatal 8218 1726776688.54761: checking for max_fail_percentage 8218 1726776688.54762: done checking for max_fail_percentage 8218 1726776688.54762: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.54763: done checking to see if all hosts have failed 8218 1726776688.54763: getting the remaining hosts for this loop 8218 1726776688.54764: done getting the remaining hosts for this loop 8218 1726776688.54766: getting the next task for host managed_node2 8218 1726776688.54770: done getting next task for host managed_node2 8218 1726776688.54773: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8218 1726776688.54774: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776688.54785: getting variables 8218 1726776688.54786: in VariableManager get_vars() 8218 1726776688.54808: Calling all_inventory to load vars for managed_node2 8218 1726776688.54810: Calling groups_inventory to load vars for managed_node2 8218 1726776688.54811: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.54817: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.54819: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.54820: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.54922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.55060: done with get_vars() 8218 1726776688.55068: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.015) 0:01:14.381 **** 8218 1726776688.55131: entering _queue_task() for managed_node2/include_tasks 8218 1726776688.55279: worker is 1 (out of 1 available) 8218 1726776688.55293: exiting _queue_task() for managed_node2/include_tasks 8218 1726776688.55304: done queuing things up, now waiting for results queue to drain 8218 1726776688.55306: waiting for pending results... 
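The `skipping: [managed_node2]` result above is produced before any module runs: when the task's `when:` expression evaluates False, the executor short-circuits and returns a skip result carrying the failed expression as `false_condition`. A rough sketch of that decision path (hypothetical helper, not ansible-core's `TaskExecutor`; real Ansible evaluates Jinja2, and the variable values below are made up for illustration):

```python
# Rough sketch of the skip path seen in the log: a False `when:` expression
# yields a result dict without executing the module at all.
def run_task(when_expr: str, variables: dict) -> dict:
    # eval() stands in for Jinja2 conditional evaluation; illustration only.
    if not eval(when_expr, {}, variables):
        return {"changed": False,
                "false_condition": when_expr,
                "skip_reason": "Conditional result was False"}
    return {"changed": True}  # the module would actually execute here

# Hypothetical values: both sides equal, so the inequality is False -> skip.
result = run_task("kernel_settings_sysctl != __kernel_settings_state_empty",
                  {"kernel_settings_sysctl": "absent",
                   "__kernel_settings_state_empty": "absent"})
```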
11165 1726776688.55433: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 11165 1726776688.55545: in run() - task 120fa90a-8a95-cec2-986e-0000000009ef 11165 1726776688.55563: variable 'ansible_search_path' from source: unknown 11165 1726776688.55567: variable 'ansible_search_path' from source: unknown 11165 1726776688.55595: calling self._execute() 11165 1726776688.55662: variable 'ansible_host' from source: host vars for 'managed_node2' 11165 1726776688.55671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11165 1726776688.55680: variable 'omit' from source: magic vars 11165 1726776688.55754: _execute() done 11165 1726776688.55762: dumping result to json 11165 1726776688.55768: done dumping result, returning 11165 1726776688.55775: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-cec2-986e-0000000009ef] 11165 1726776688.55782: sending task result for task 120fa90a-8a95-cec2-986e-0000000009ef 11165 1726776688.55806: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009ef 11165 1726776688.55810: WORKER PROCESS EXITING 8218 1726776688.55910: no more pending results, returning what we have 8218 1726776688.55914: in VariableManager get_vars() 8218 1726776688.55948: Calling all_inventory to load vars for managed_node2 8218 1726776688.55951: Calling groups_inventory to load vars for managed_node2 8218 1726776688.55953: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.55961: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.55963: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.55965: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.56068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 
1726776688.56178: done with get_vars() 8218 1726776688.56184: variable 'ansible_search_path' from source: unknown 8218 1726776688.56184: variable 'ansible_search_path' from source: unknown 8218 1726776688.56207: we have included files to process 8218 1726776688.56207: generating all_blocks data 8218 1726776688.56208: done generating all_blocks data 8218 1726776688.56216: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776688.56216: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776688.56218: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8218 1726776688.56650: done processing included file 8218 1726776688.56652: iterating over new_blocks loaded from include file 8218 1726776688.56653: in VariableManager get_vars() 8218 1726776688.56668: done with get_vars() 8218 1726776688.56669: filtering new block on tags 8218 1726776688.56687: done filtering new block on tags 8218 1726776688.56707: in VariableManager get_vars() 8218 1726776688.56721: done with get_vars() 8218 1726776688.56722: filtering new block on tags 8218 1726776688.56745: done filtering new block on tags 8218 1726776688.56747: in VariableManager get_vars() 8218 1726776688.56760: done with get_vars() 8218 1726776688.56761: filtering new block on tags 8218 1726776688.56783: done filtering new block on tags 8218 1726776688.56784: in VariableManager get_vars() 8218 1726776688.56797: done with get_vars() 8218 1726776688.56798: filtering new block on tags 8218 1726776688.56811: done filtering new block on tags 8218 1726776688.56812: done iterating over new_blocks loaded from include file 8218 1726776688.56813: 
extending task lists for all hosts with included blocks 8218 1726776688.56900: done extending task lists 8218 1726776688.56901: done processing included files 8218 1726776688.56901: results queue empty 8218 1726776688.56902: checking for any_errors_fatal 8218 1726776688.56905: done checking for any_errors_fatal 8218 1726776688.56905: checking for max_fail_percentage 8218 1726776688.56906: done checking for max_fail_percentage 8218 1726776688.56907: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.56907: done checking to see if all hosts have failed 8218 1726776688.56908: getting the remaining hosts for this loop 8218 1726776688.56908: done getting the remaining hosts for this loop 8218 1726776688.56910: getting the next task for host managed_node2 8218 1726776688.56912: done getting next task for host managed_node2 8218 1726776688.56914: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8218 1726776688.56916: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776688.56922: getting variables 8218 1726776688.56923: in VariableManager get_vars() 8218 1726776688.56933: Calling all_inventory to load vars for managed_node2 8218 1726776688.56934: Calling groups_inventory to load vars for managed_node2 8218 1726776688.56935: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.56938: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.56939: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.56941: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.57014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.57120: done with get_vars() 8218 1726776688.57127: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.020) 0:01:14.402 **** 8218 1726776688.57173: entering _queue_task() for managed_node2/setup 8218 1726776688.57324: worker is 1 (out of 1 available) 8218 1726776688.57339: exiting _queue_task() for managed_node2/setup 8218 1726776688.57350: done queuing things up, now waiting for results queue to drain 8218 1726776688.57352: waiting for pending results... 
11166 1726776688.57478: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 11166 1726776688.57604: in run() - task 120fa90a-8a95-cec2-986e-000000000bd2 11166 1726776688.57620: variable 'ansible_search_path' from source: unknown 11166 1726776688.57624: variable 'ansible_search_path' from source: unknown 11166 1726776688.57652: calling self._execute() 11166 1726776688.57715: variable 'ansible_host' from source: host vars for 'managed_node2' 11166 1726776688.57780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11166 1726776688.57790: variable 'omit' from source: magic vars 11166 1726776688.58150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11166 1726776688.59666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11166 1726776688.59719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11166 1726776688.59749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11166 1726776688.59778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11166 1726776688.59799: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11166 1726776688.59855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11166 1726776688.59878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11166 1726776688.59896: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11166 1726776688.59923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11166 1726776688.59936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11166 1726776688.59977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11166 1726776688.59995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11166 1726776688.60012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11166 1726776688.60039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11166 1726776688.60051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11166 1726776688.60171: variable '__kernel_settings_required_facts' from source: role '' all vars 11166 1726776688.60182: variable 'ansible_facts' from source: unknown 11166 1726776688.60239: Evaluated conditional 
(__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11166 1726776688.60245: when evaluation is False, skipping this task 11166 1726776688.60249: _execute() done 11166 1726776688.60252: dumping result to json 11166 1726776688.60256: done dumping result, returning 11166 1726776688.60265: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-cec2-986e-000000000bd2] 11166 1726776688.60271: sending task result for task 120fa90a-8a95-cec2-986e-000000000bd2 11166 1726776688.60294: done sending task result for task 120fa90a-8a95-cec2-986e-000000000bd2 11166 1726776688.60298: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8218 1726776688.60422: no more pending results, returning what we have 8218 1726776688.60425: results queue empty 8218 1726776688.60425: checking for any_errors_fatal 8218 1726776688.60427: done checking for any_errors_fatal 8218 1726776688.60428: checking for max_fail_percentage 8218 1726776688.60430: done checking for max_fail_percentage 8218 1726776688.60431: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.60432: done checking to see if all hosts have failed 8218 1726776688.60432: getting the remaining hosts for this loop 8218 1726776688.60434: done getting the remaining hosts for this loop 8218 1726776688.60437: getting the next task for host managed_node2 8218 1726776688.60445: done getting next task for host managed_node2 8218 1726776688.60448: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8218 1726776688.60451: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776688.60467: getting variables 8218 1726776688.60469: in VariableManager get_vars() 8218 1726776688.60549: Calling all_inventory to load vars for managed_node2 8218 1726776688.60551: Calling groups_inventory to load vars for managed_node2 8218 1726776688.60553: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.60559: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.60561: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.60563: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.60666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.60791: done with get_vars() 8218 1726776688.60800: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.036) 0:01:14.439 **** 8218 1726776688.60868: entering _queue_task() for managed_node2/stat 8218 1726776688.61023: worker is 1 (out of 1 available) 8218 1726776688.61039: exiting _queue_task() for 
managed_node2/stat 8218 1726776688.61051: done queuing things up, now waiting for results queue to drain 8218 1726776688.61053: waiting for pending results... 11167 1726776688.61180: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 11167 1726776688.61306: in run() - task 120fa90a-8a95-cec2-986e-000000000bd4 11167 1726776688.61322: variable 'ansible_search_path' from source: unknown 11167 1726776688.61326: variable 'ansible_search_path' from source: unknown 11167 1726776688.61353: calling self._execute() 11167 1726776688.61418: variable 'ansible_host' from source: host vars for 'managed_node2' 11167 1726776688.61427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11167 1726776688.61438: variable 'omit' from source: magic vars 11167 1726776688.61763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11167 1726776688.61940: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11167 1726776688.61997: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11167 1726776688.62023: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11167 1726776688.62050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11167 1726776688.62108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11167 1726776688.62127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11167 1726776688.62150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11167 1726776688.62170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11167 1726776688.62254: variable '__kernel_settings_is_ostree' from source: set_fact 11167 1726776688.62267: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11167 1726776688.62271: when evaluation is False, skipping this task 11167 1726776688.62275: _execute() done 11167 1726776688.62278: dumping result to json 11167 1726776688.62282: done dumping result, returning 11167 1726776688.62287: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-cec2-986e-000000000bd4] 11167 1726776688.62294: sending task result for task 120fa90a-8a95-cec2-986e-000000000bd4 11167 1726776688.62315: done sending task result for task 120fa90a-8a95-cec2-986e-000000000bd4 11167 1726776688.62319: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8218 1726776688.62421: no more pending results, returning what we have 8218 1726776688.62423: results queue empty 8218 1726776688.62424: checking for any_errors_fatal 8218 1726776688.62431: done checking for any_errors_fatal 8218 1726776688.62432: checking for max_fail_percentage 8218 1726776688.62433: done checking for max_fail_percentage 8218 1726776688.62434: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.62435: done checking to see if all hosts have failed 8218 1726776688.62435: getting the remaining hosts for this loop 8218 1726776688.62437: done getting the remaining hosts for this loop 8218 1726776688.62439: getting the next task for host 
managed_node2 8218 1726776688.62445: done getting next task for host managed_node2 8218 1726776688.62448: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8218 1726776688.62451: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776688.62465: getting variables 8218 1726776688.62466: in VariableManager get_vars() 8218 1726776688.62495: Calling all_inventory to load vars for managed_node2 8218 1726776688.62497: Calling groups_inventory to load vars for managed_node2 8218 1726776688.62499: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.62507: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.62509: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.62511: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.62611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.62749: done with get_vars() 8218 1726776688.62756: done getting variables 8218 1726776688.62794: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.019) 0:01:14.458 **** 8218 1726776688.62817: entering _queue_task() for managed_node2/set_fact 8218 1726776688.62959: worker is 1 (out of 1 available) 8218 1726776688.62972: exiting _queue_task() for managed_node2/set_fact 8218 1726776688.62984: done queuing things up, now waiting for results queue to drain 8218 1726776688.62985: waiting for pending results... 
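Both ostree tasks in this stretch of the log are skipped with `"false_condition": "not __kernel_settings_is_ostree is defined"`. A minimal Python sketch (illustrative only, not Ansible source) of how that `not <var> is defined` guard behaves once a prior run has cached the fact via `set_fact`:

```python
# Emulate the Jinja2 test `not __kernel_settings_is_ostree is defined`.
# Once an earlier include of the role has set the fact with set_fact,
# the guard evaluates to False and both the stat task and the follow-up
# set_fact task are skipped, exactly as the log shows.
def guard_not_defined(varname, host_facts):
    """True only while the fact has never been set for this host."""
    return varname not in host_facts

facts = {"__kernel_settings_is_ostree": False}  # cached by an earlier run
print(guard_not_defined("__kernel_settings_is_ostree", facts))  # prints False -> task skipped
print(guard_not_defined("__kernel_settings_is_ostree", {}))     # prints True  -> task would run
```

Note the guard tests only whether the variable is defined, not its value: a cached `False` (system is not ostree) still suppresses re-detection.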
11168 1726776688.63112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 11168 1726776688.63233: in run() - task 120fa90a-8a95-cec2-986e-000000000bd5 11168 1726776688.63250: variable 'ansible_search_path' from source: unknown 11168 1726776688.63254: variable 'ansible_search_path' from source: unknown 11168 1726776688.63281: calling self._execute() 11168 1726776688.63342: variable 'ansible_host' from source: host vars for 'managed_node2' 11168 1726776688.63352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11168 1726776688.63362: variable 'omit' from source: magic vars 11168 1726776688.63683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11168 1726776688.63850: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11168 1726776688.63887: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11168 1726776688.63911: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11168 1726776688.63940: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11168 1726776688.63996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11168 1726776688.64016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11168 1726776688.64036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11168 1726776688.64054: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11168 1726776688.64139: variable '__kernel_settings_is_ostree' from source: set_fact 11168 1726776688.64150: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11168 1726776688.64154: when evaluation is False, skipping this task 11168 1726776688.64160: _execute() done 11168 1726776688.64164: dumping result to json 11168 1726776688.64168: done dumping result, returning 11168 1726776688.64174: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-cec2-986e-000000000bd5] 11168 1726776688.64180: sending task result for task 120fa90a-8a95-cec2-986e-000000000bd5 11168 1726776688.64202: done sending task result for task 120fa90a-8a95-cec2-986e-000000000bd5 11168 1726776688.64206: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8218 1726776688.64326: no more pending results, returning what we have 8218 1726776688.64331: results queue empty 8218 1726776688.64332: checking for any_errors_fatal 8218 1726776688.64340: done checking for any_errors_fatal 8218 1726776688.64340: checking for max_fail_percentage 8218 1726776688.64341: done checking for max_fail_percentage 8218 1726776688.64342: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.64343: done checking to see if all hosts have failed 8218 1726776688.64343: getting the remaining hosts for this loop 8218 1726776688.64345: done getting the remaining hosts for this loop 8218 1726776688.64348: getting the next task for host managed_node2 8218 1726776688.64355: done getting next task for host managed_node2 8218 1726776688.64358: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8218 1726776688.64361: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776688.64374: getting variables 8218 1726776688.64374: in VariableManager get_vars() 8218 1726776688.64398: Calling all_inventory to load vars for managed_node2 8218 1726776688.64399: Calling groups_inventory to load vars for managed_node2 8218 1726776688.64401: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.64406: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.64408: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.64409: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.64508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.64625: done with get_vars() 8218 1726776688.64635: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 
Thursday 19 September 2024 16:11:28 -0400 (0:00:00.018) 0:01:14.477 **** 8218 1726776688.64696: entering _queue_task() for managed_node2/stat 8218 1726776688.64843: worker is 1 (out of 1 available) 8218 1726776688.64857: exiting _queue_task() for managed_node2/stat 8218 1726776688.64868: done queuing things up, now waiting for results queue to drain 8218 1726776688.64870: waiting for pending results... 11169 1726776688.64991: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 11169 1726776688.65117: in run() - task 120fa90a-8a95-cec2-986e-000000000bd7 11169 1726776688.65134: variable 'ansible_search_path' from source: unknown 11169 1726776688.65138: variable 'ansible_search_path' from source: unknown 11169 1726776688.65166: calling self._execute() 11169 1726776688.65227: variable 'ansible_host' from source: host vars for 'managed_node2' 11169 1726776688.65237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11169 1726776688.65246: variable 'omit' from source: magic vars 11169 1726776688.65569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11169 1726776688.65786: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11169 1726776688.65821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11169 1726776688.65848: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11169 1726776688.65875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11169 1726776688.65932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11169 1726776688.65951: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11169 1726776688.65972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11169 1726776688.65991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11169 1726776688.66074: variable '__kernel_settings_is_transactional' from source: set_fact 11169 1726776688.66085: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11169 1726776688.66089: when evaluation is False, skipping this task 11169 1726776688.66093: _execute() done 11169 1726776688.66096: dumping result to json 11169 1726776688.66100: done dumping result, returning 11169 1726776688.66106: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-cec2-986e-000000000bd7] 11169 1726776688.66112: sending task result for task 120fa90a-8a95-cec2-986e-000000000bd7 11169 1726776688.66138: done sending task result for task 120fa90a-8a95-cec2-986e-000000000bd7 11169 1726776688.66142: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8218 1726776688.66241: no more pending results, returning what we have 8218 1726776688.66244: results queue empty 8218 1726776688.66244: checking for any_errors_fatal 8218 1726776688.66249: done checking for any_errors_fatal 8218 1726776688.66249: checking for max_fail_percentage 8218 1726776688.66251: done checking for max_fail_percentage 8218 1726776688.66251: checking to 
see if all hosts have failed and the running result is not ok 8218 1726776688.66252: done checking to see if all hosts have failed 8218 1726776688.66253: getting the remaining hosts for this loop 8218 1726776688.66254: done getting the remaining hosts for this loop 8218 1726776688.66257: getting the next task for host managed_node2 8218 1726776688.66265: done getting next task for host managed_node2 8218 1726776688.66268: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8218 1726776688.66271: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776688.66286: getting variables 8218 1726776688.66287: in VariableManager get_vars() 8218 1726776688.66316: Calling all_inventory to load vars for managed_node2 8218 1726776688.66318: Calling groups_inventory to load vars for managed_node2 8218 1726776688.66320: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.66327: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.66330: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.66333: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.66473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.66587: done with get_vars() 8218 1726776688.66593: done getting variables 8218 1726776688.66634: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.019) 0:01:14.497 **** 8218 1726776688.66656: entering _queue_task() for managed_node2/set_fact 8218 1726776688.66801: worker is 1 (out of 1 available) 8218 1726776688.66814: exiting _queue_task() for managed_node2/set_fact 8218 1726776688.66826: done queuing things up, now waiting for results queue to drain 8218 1726776688.66827: waiting for pending results... 
11170 1726776688.66946: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 11170 1726776688.67066: in run() - task 120fa90a-8a95-cec2-986e-000000000bd8 11170 1726776688.67081: variable 'ansible_search_path' from source: unknown 11170 1726776688.67085: variable 'ansible_search_path' from source: unknown 11170 1726776688.67110: calling self._execute() 11170 1726776688.67172: variable 'ansible_host' from source: host vars for 'managed_node2' 11170 1726776688.67181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11170 1726776688.67190: variable 'omit' from source: magic vars 11170 1726776688.67505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11170 1726776688.67673: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11170 1726776688.67707: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11170 1726776688.67734: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11170 1726776688.67760: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11170 1726776688.67817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11170 1726776688.67839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11170 1726776688.67858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11170 1726776688.67876: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11170 1726776688.67957: variable '__kernel_settings_is_transactional' from source: set_fact 11170 1726776688.67969: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11170 1726776688.67973: when evaluation is False, skipping this task 11170 1726776688.67977: _execute() done 11170 1726776688.67980: dumping result to json 11170 1726776688.67984: done dumping result, returning 11170 1726776688.67990: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-cec2-986e-000000000bd8] 11170 1726776688.67996: sending task result for task 120fa90a-8a95-cec2-986e-000000000bd8 11170 1726776688.68017: done sending task result for task 120fa90a-8a95-cec2-986e-000000000bd8 11170 1726776688.68021: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8218 1726776688.68111: no more pending results, returning what we have 8218 1726776688.68114: results queue empty 8218 1726776688.68115: checking for any_errors_fatal 8218 1726776688.68120: done checking for any_errors_fatal 8218 1726776688.68121: checking for max_fail_percentage 8218 1726776688.68122: done checking for max_fail_percentage 8218 1726776688.68123: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.68124: done checking to see if all hosts have failed 8218 1726776688.68124: getting the remaining hosts for this loop 8218 1726776688.68126: done getting the remaining hosts for this loop 8218 1726776688.68130: getting the next task for host managed_node2 8218 1726776688.68137: done getting next task for host managed_node2 8218 1726776688.68141: ^ 
task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8218 1726776688.68144: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776688.68160: getting variables 8218 1726776688.68162: in VariableManager get_vars() 8218 1726776688.68190: Calling all_inventory to load vars for managed_node2 8218 1726776688.68192: Calling groups_inventory to load vars for managed_node2 8218 1726776688.68194: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.68200: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.68202: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.68204: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.68305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.68422: done with get_vars() 8218 1726776688.68431: done getting variables 8218 1726776688.68470: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.018) 0:01:14.515 **** 8218 1726776688.68493: entering _queue_task() for managed_node2/include_vars 8218 1726776688.68637: worker is 1 (out of 1 available) 8218 1726776688.68648: exiting _queue_task() for managed_node2/include_vars 8218 1726776688.68662: done queuing things up, now waiting for results queue to drain 8218 1726776688.68664: waiting for pending results... 11171 1726776688.68777: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 11171 1726776688.68895: in run() - task 120fa90a-8a95-cec2-986e-000000000bda 11171 1726776688.68910: variable 'ansible_search_path' from source: unknown 11171 1726776688.68914: variable 'ansible_search_path' from source: unknown 11171 1726776688.68940: calling self._execute() 11171 1726776688.69003: variable 'ansible_host' from source: host vars for 'managed_node2' 11171 1726776688.69012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11171 1726776688.69021: variable 'omit' from source: magic vars 11171 1726776688.69094: variable 'omit' from source: magic vars 11171 1726776688.69142: variable 'omit' from source: magic vars 11171 1726776688.69389: variable 'ffparams' from source: task vars 11171 1726776688.69527: variable 'ansible_facts' from source: unknown 11171 1726776688.69648: variable 'ansible_facts' from source: unknown 11171 1726776688.69735: variable 'ansible_facts' from source: unknown 11171 1726776688.69820: variable 'ansible_facts' from source: unknown 11171 
1726776688.69896: variable 'role_path' from source: magic vars 11171 1726776688.70008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11171 1726776688.70143: Loaded config def from plugin (lookup/first_found) 11171 1726776688.70151: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 11171 1726776688.70178: variable 'ansible_search_path' from source: unknown 11171 1726776688.70196: variable 'ansible_search_path' from source: unknown 11171 1726776688.70204: variable 'ansible_search_path' from source: unknown 11171 1726776688.70211: variable 'ansible_search_path' from source: unknown 11171 1726776688.70219: variable 'ansible_search_path' from source: unknown 11171 1726776688.70234: variable 'omit' from source: magic vars 11171 1726776688.70252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11171 1726776688.70270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11171 1726776688.70285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11171 1726776688.70298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11171 1726776688.70308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11171 1726776688.70328: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11171 1726776688.70334: variable 'ansible_host' from source: host vars for 'managed_node2' 11171 1726776688.70338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11171 1726776688.70396: Set connection var ansible_connection to ssh 11171 1726776688.70403: Set connection var ansible_pipelining to False 11171 1726776688.70410: Set connection var ansible_timeout to 10 
11171 1726776688.70415: Set connection var ansible_module_compression to ZIP_DEFLATED 11171 1726776688.70418: Set connection var ansible_shell_type to sh 11171 1726776688.70421: Set connection var ansible_shell_executable to /bin/sh 11171 1726776688.70442: variable 'ansible_shell_executable' from source: unknown 11171 1726776688.70447: variable 'ansible_connection' from source: unknown 11171 1726776688.70450: variable 'ansible_module_compression' from source: unknown 11171 1726776688.70453: variable 'ansible_shell_type' from source: unknown 11171 1726776688.70457: variable 'ansible_shell_executable' from source: unknown 11171 1726776688.70460: variable 'ansible_host' from source: host vars for 'managed_node2' 11171 1726776688.70464: variable 'ansible_pipelining' from source: unknown 11171 1726776688.70467: variable 'ansible_timeout' from source: unknown 11171 1726776688.70471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11171 1726776688.70537: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11171 1726776688.70548: variable 'omit' from source: magic vars 11171 1726776688.70554: starting attempt loop 11171 1726776688.70557: running the handler 11171 1726776688.70597: handler run complete 11171 1726776688.70607: attempt loop complete, returning result 11171 1726776688.70611: _execute() done 11171 1726776688.70614: dumping result to json 11171 1726776688.70619: done dumping result, returning 11171 1726776688.70625: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-cec2-986e-000000000bda] 11171 1726776688.70633: sending task result for task 
120fa90a-8a95-cec2-986e-000000000bda 11171 1726776688.70657: done sending task result for task 120fa90a-8a95-cec2-986e-000000000bda 11171 1726776688.70660: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8218 1726776688.70790: no more pending results, returning what we have 8218 1726776688.70792: results queue empty 8218 1726776688.70793: checking for any_errors_fatal 8218 1726776688.70796: done checking for any_errors_fatal 8218 1726776688.70797: checking for max_fail_percentage 8218 1726776688.70798: done checking for max_fail_percentage 8218 1726776688.70799: checking to see if all hosts have failed and the running result is not ok 8218 1726776688.70799: done checking to see if all hosts have failed 8218 1726776688.70800: getting the remaining hosts for this loop 8218 1726776688.70801: done getting the remaining hosts for this loop 8218 1726776688.70803: getting the next task for host managed_node2 8218 1726776688.70811: done getting next task for host managed_node2 8218 1726776688.70814: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8218 1726776688.70816: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776688.70826: getting variables 8218 1726776688.70827: in VariableManager get_vars() 8218 1726776688.70855: Calling all_inventory to load vars for managed_node2 8218 1726776688.70857: Calling groups_inventory to load vars for managed_node2 8218 1726776688.70860: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776688.70866: Calling all_plugins_play to load vars for managed_node2 8218 1726776688.70868: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776688.70869: Calling groups_plugins_play to load vars for managed_node2 8218 1726776688.71003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776688.71116: done with get_vars() 8218 1726776688.71122: done getting variables 8218 1726776688.71163: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:11:28 -0400 (0:00:00.026) 0:01:14.542 **** 8218 1726776688.71184: entering _queue_task() for managed_node2/package 8218 1726776688.71321: worker is 1 (out of 1 available) 8218 1726776688.71337: exiting _queue_task() for managed_node2/package 8218 1726776688.71348: done queuing things up, now waiting for results queue to drain 8218 1726776688.71350: waiting for pending results... 
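The connection var `ansible_module_compression` set to `ZIP_DEFLATED` above controls how the AnsiballZ payload is packed before transfer to the managed node. A minimal local sketch of that packing step, assuming a hypothetical module source (the archive member name and payload here are illustrative, not taken from this run):

```python
import io
import zipfile

# Hypothetical module source; AnsiballZ packs real module code the same way,
# into a zip archive written with ZIP_DEFLATED (matching the var set above).
module_src = b"def main():\n    pass\n"
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("ansible/modules/example.py", module_src)

payload = buf.getvalue()
# Round-trip to confirm the archive is intact and deflate-compressed.
with zipfile.ZipFile(io.BytesIO(payload)) as zf:
    info = zf.getinfo("ansible/modules/example.py")
    assert info.compress_type == zipfile.ZIP_DEFLATED
    assert zf.read(info) == module_src
```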
11172 1726776688.71468: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 11172 1726776688.71578: in run() - task 120fa90a-8a95-cec2-986e-0000000009f0 11172 1726776688.71592: variable 'ansible_search_path' from source: unknown 11172 1726776688.71595: variable 'ansible_search_path' from source: unknown 11172 1726776688.71620: calling self._execute() 11172 1726776688.71682: variable 'ansible_host' from source: host vars for 'managed_node2' 11172 1726776688.71690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11172 1726776688.71699: variable 'omit' from source: magic vars 11172 1726776688.71770: variable 'omit' from source: magic vars 11172 1726776688.71803: variable 'omit' from source: magic vars 11172 1726776688.71824: variable '__kernel_settings_packages' from source: include_vars 11172 1726776688.72020: variable '__kernel_settings_packages' from source: include_vars 11172 1726776688.72168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11172 1726776688.73808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11172 1726776688.73853: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11172 1726776688.73881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11172 1726776688.73917: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11172 1726776688.73939: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11172 1726776688.74002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11172 1726776688.74021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11172 1726776688.74042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11172 1726776688.74070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11172 1726776688.74082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11172 1726776688.74147: variable '__kernel_settings_is_ostree' from source: set_fact 11172 1726776688.74154: variable 'omit' from source: magic vars 11172 1726776688.74176: variable 'omit' from source: magic vars 11172 1726776688.74196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11172 1726776688.74217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11172 1726776688.74233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11172 1726776688.74246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11172 1726776688.74255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11172 1726776688.74277: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11172 1726776688.74282: variable 'ansible_host' from source: host vars for 
'managed_node2' 11172 1726776688.74286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11172 1726776688.74350: Set connection var ansible_connection to ssh 11172 1726776688.74358: Set connection var ansible_pipelining to False 11172 1726776688.74365: Set connection var ansible_timeout to 10 11172 1726776688.74372: Set connection var ansible_module_compression to ZIP_DEFLATED 11172 1726776688.74378: Set connection var ansible_shell_type to sh 11172 1726776688.74383: Set connection var ansible_shell_executable to /bin/sh 11172 1726776688.74399: variable 'ansible_shell_executable' from source: unknown 11172 1726776688.74403: variable 'ansible_connection' from source: unknown 11172 1726776688.74406: variable 'ansible_module_compression' from source: unknown 11172 1726776688.74409: variable 'ansible_shell_type' from source: unknown 11172 1726776688.74413: variable 'ansible_shell_executable' from source: unknown 11172 1726776688.74416: variable 'ansible_host' from source: host vars for 'managed_node2' 11172 1726776688.74420: variable 'ansible_pipelining' from source: unknown 11172 1726776688.74423: variable 'ansible_timeout' from source: unknown 11172 1726776688.74428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11172 1726776688.74488: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11172 1726776688.74500: variable 'omit' from source: magic vars 11172 1726776688.74505: starting attempt loop 11172 1726776688.74508: running the handler 11172 1726776688.74567: variable 'ansible_facts' from source: unknown 11172 1726776688.74641: _low_level_execute_command(): starting 11172 1726776688.74650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 11172 1726776688.76946: stdout chunk (state=2): >>>/root <<< 11172 1726776688.77064: stderr chunk (state=3): >>><<< 11172 1726776688.77072: stdout chunk (state=3): >>><<< 11172 1726776688.77088: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11172 1726776688.77099: _low_level_execute_command(): starting 11172 1726776688.77104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892 `" && echo ansible-tmp-1726776688.770955-11172-51748142496892="` echo /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892 `" ) && sleep 0' 11172 1726776688.79640: stdout chunk (state=2): >>>ansible-tmp-1726776688.770955-11172-51748142496892=/root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892 <<< 11172 1726776688.79766: stderr chunk (state=3): >>><<< 11172 1726776688.79772: stdout chunk (state=3): >>><<< 11172 1726776688.79785: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776688.770955-11172-51748142496892=/root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892 , stderr= 11172 1726776688.79806: variable 'ansible_module_compression' from source: unknown 11172 1726776688.79850: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11172 1726776688.79883: variable 'ansible_facts' from source: unknown 11172 1726776688.79972: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/AnsiballZ_dnf.py 11172 1726776688.80064: Sending initial data 11172 1726776688.80071: Sent initial data (149 bytes) 11172 1726776688.82558: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpihdflgky /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/AnsiballZ_dnf.py <<< 11172 1726776688.83896: stderr chunk (state=3): >>><<< 
11172 1726776688.83902: stdout chunk (state=3): >>><<< 11172 1726776688.83919: done transferring module to remote 11172 1726776688.83931: _low_level_execute_command(): starting 11172 1726776688.83936: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/ /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/AnsiballZ_dnf.py && sleep 0' 11172 1726776688.86336: stderr chunk (state=2): >>><<< 11172 1726776688.86345: stdout chunk (state=2): >>><<< 11172 1726776688.86359: _low_level_execute_command() done: rc=0, stdout=, stderr= 11172 1726776688.86364: _low_level_execute_command(): starting 11172 1726776688.86369: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/AnsiballZ_dnf.py && sleep 0' 11172 1726776691.40768: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 11172 1726776691.48813: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11172 1726776691.48864: stderr chunk (state=3): >>><<< 11172 1726776691.48872: stdout chunk (state=3): >>><<< 11172 1726776691.48888: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11172 1726776691.48921: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11172 1726776691.48930: _low_level_execute_command(): starting 11172 1726776691.48936: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776688.770955-11172-51748142496892/ > /dev/null 2>&1 && sleep 0' 11172 1726776691.51397: stderr chunk (state=2): >>><<< 11172 1726776691.51406: stdout chunk (state=2): >>><<< 11172 1726776691.51421: _low_level_execute_command() done: rc=0, stdout=, stderr= 11172 1726776691.51430: handler run complete 11172 1726776691.51458: attempt loop complete, returning result 11172 1726776691.51463: _execute() done 11172 1726776691.51466: dumping result to json 11172 1726776691.51473: done dumping result, returning 11172 1726776691.51480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-cec2-986e-0000000009f0] 11172 1726776691.51486: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f0 11172 1726776691.51515: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f0 11172 1726776691.51518: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8218 1726776691.51682: no more pending results, 
returning what we have 8218 1726776691.51685: results queue empty 8218 1726776691.51686: checking for any_errors_fatal 8218 1726776691.51692: done checking for any_errors_fatal 8218 1726776691.51693: checking for max_fail_percentage 8218 1726776691.51694: done checking for max_fail_percentage 8218 1726776691.51695: checking to see if all hosts have failed and the running result is not ok 8218 1726776691.51695: done checking to see if all hosts have failed 8218 1726776691.51696: getting the remaining hosts for this loop 8218 1726776691.51697: done getting the remaining hosts for this loop 8218 1726776691.51700: getting the next task for host managed_node2 8218 1726776691.51707: done getting next task for host managed_node2 8218 1726776691.51710: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8218 1726776691.51713: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776691.51722: getting variables 8218 1726776691.51723: in VariableManager get_vars() 8218 1726776691.51758: Calling all_inventory to load vars for managed_node2 8218 1726776691.51763: Calling groups_inventory to load vars for managed_node2 8218 1726776691.51765: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776691.51773: Calling all_plugins_play to load vars for managed_node2 8218 1726776691.51774: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776691.51776: Calling groups_plugins_play to load vars for managed_node2 8218 1726776691.51885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776691.52006: done with get_vars() 8218 1726776691.52015: done getting variables 8218 1726776691.52060: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:11:31 -0400 (0:00:02.809) 0:01:17.351 **** 8218 1726776691.52087: entering _queue_task() for managed_node2/debug 8218 1726776691.52255: worker is 1 (out of 1 available) 8218 1726776691.52272: exiting _queue_task() for managed_node2/debug 8218 1726776691.52284: done queuing things up, now waiting for results queue to drain 8218 1726776691.52286: waiting for pending results... 
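The dnf module result above comes back as a single JSON document on stdout; the callback's green `ok:` line corresponds to `changed: false` with `rc: 0`. A small sketch of reading the fields that matter (the JSON literal below is abridged from the result printed above):

```python
import json

# Abridged copy of the module result shown in the log above.
raw = (
    '{"msg": "Nothing to do", "changed": false, "rc": 0, "results": [], '
    '"invocation": {"module_args": {"name": ["tuned", "python3-configobj"], '
    '"state": "present"}}}'
)
result = json.loads(raw)

# "ok" vs "changed" in callback output is driven by the "changed" key.
status = "changed" if result["changed"] else "ok"
print(status, result["rc"], result["invocation"]["module_args"]["name"])
# -> ok 0 ['tuned', 'python3-configobj']
```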
11189 1726776691.52409: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 11189 1726776691.52524: in run() - task 120fa90a-8a95-cec2-986e-0000000009f2 11189 1726776691.52541: variable 'ansible_search_path' from source: unknown 11189 1726776691.52545: variable 'ansible_search_path' from source: unknown 11189 1726776691.52573: calling self._execute() 11189 1726776691.52640: variable 'ansible_host' from source: host vars for 'managed_node2' 11189 1726776691.52650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11189 1726776691.52659: variable 'omit' from source: magic vars 11189 1726776691.52990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11189 1726776691.54708: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11189 1726776691.54753: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11189 1726776691.54785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11189 1726776691.54810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11189 1726776691.54832: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11189 1726776691.54886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11189 1726776691.54908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11189 1726776691.54926: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11189 1726776691.54954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11189 1726776691.54968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11189 1726776691.55042: variable '__kernel_settings_is_transactional' from source: set_fact 11189 1726776691.55059: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11189 1726776691.55065: when evaluation is False, skipping this task 11189 1726776691.55069: _execute() done 11189 1726776691.55073: dumping result to json 11189 1726776691.55077: done dumping result, returning 11189 1726776691.55083: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-0000000009f2] 11189 1726776691.55089: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f2 11189 1726776691.55111: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f2 11189 1726776691.55115: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8218 1726776691.55243: no more pending results, returning what we have 8218 1726776691.55246: results queue empty 8218 1726776691.55247: checking for any_errors_fatal 8218 1726776691.55253: done checking for any_errors_fatal 8218 1726776691.55254: checking for max_fail_percentage 8218 1726776691.55255: done checking for max_fail_percentage 8218 1726776691.55256: checking to see if all hosts have failed and the running 
result is not ok 8218 1726776691.55256: done checking to see if all hosts have failed 8218 1726776691.55257: getting the remaining hosts for this loop 8218 1726776691.55258: done getting the remaining hosts for this loop 8218 1726776691.55261: getting the next task for host managed_node2 8218 1726776691.55266: done getting next task for host managed_node2 8218 1726776691.55270: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8218 1726776691.55272: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776691.55287: getting variables 8218 1726776691.55288: in VariableManager get_vars() 8218 1726776691.55318: Calling all_inventory to load vars for managed_node2 8218 1726776691.55321: Calling groups_inventory to load vars for managed_node2 8218 1726776691.55323: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776691.55332: Calling all_plugins_play to load vars for managed_node2 8218 1726776691.55333: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776691.55335: Calling groups_plugins_play to load vars for managed_node2 8218 1726776691.55442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776691.55595: done with get_vars() 8218 1726776691.55602: done getting variables 8218 1726776691.55643: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:11:31 -0400 (0:00:00.035) 0:01:17.387 **** 8218 1726776691.55666: entering _queue_task() for managed_node2/reboot 8218 1726776691.55818: worker is 1 (out of 1 available) 8218 1726776691.55834: exiting _queue_task() for managed_node2/reboot 8218 1726776691.55845: done queuing things up, now waiting for results queue to drain 8218 1726776691.55847: waiting for pending results... 
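The `_low_level_execute_command()` calls earlier in this run created the remote working directory with a `( umask 77 && mkdir ... )` one-liner, which yields a directory private to the connecting user. A local Python model of that same idiom, using a throwaway base directory (paths are illustrative):

```python
import os
import stat
import tempfile

# Model of the remote tmpdir creation: apply umask 077, then mkdir,
# so the resulting directory mode is 0700 (owner-only), as Ansible intends.
base = tempfile.mkdtemp()
old_umask = os.umask(0o77)
try:
    workdir = os.path.join(base, "ansible-tmp-example")
    os.mkdir(workdir)
finally:
    os.umask(old_umask)  # restore the process umask

mode = stat.S_IMODE(os.stat(workdir).st_mode)
assert mode == 0o700
```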
11190 1726776691.55972: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 11190 1726776691.56083: in run() - task 120fa90a-8a95-cec2-986e-0000000009f3 11190 1726776691.56098: variable 'ansible_search_path' from source: unknown 11190 1726776691.56102: variable 'ansible_search_path' from source: unknown 11190 1726776691.56130: calling self._execute() 11190 1726776691.56195: variable 'ansible_host' from source: host vars for 'managed_node2' 11190 1726776691.56203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11190 1726776691.56212: variable 'omit' from source: magic vars 11190 1726776691.56545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11190 1726776691.58218: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11190 1726776691.58266: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11190 1726776691.58303: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11190 1726776691.58330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11190 1726776691.58352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11190 1726776691.58405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11190 1726776691.58427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11190 1726776691.58447: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11190 1726776691.58477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11190 1726776691.58488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11190 1726776691.58559: variable '__kernel_settings_is_transactional' from source: set_fact 11190 1726776691.58577: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11190 1726776691.58581: when evaluation is False, skipping this task 11190 1726776691.58585: _execute() done 11190 1726776691.58589: dumping result to json 11190 1726776691.58592: done dumping result, returning 11190 1726776691.58598: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-cec2-986e-0000000009f3] 11190 1726776691.58604: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f3 11190 1726776691.58626: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f3 11190 1726776691.58631: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776691.58732: no more pending results, returning what we have 8218 1726776691.58735: results queue empty 8218 1726776691.58736: checking for any_errors_fatal 8218 1726776691.58741: done checking for any_errors_fatal 8218 1726776691.58742: checking for max_fail_percentage 8218 1726776691.58743: done checking for max_fail_percentage 8218 1726776691.58744: checking 
to see if all hosts have failed and the running result is not ok 8218 1726776691.58745: done checking to see if all hosts have failed 8218 1726776691.58745: getting the remaining hosts for this loop 8218 1726776691.58746: done getting the remaining hosts for this loop 8218 1726776691.58749: getting the next task for host managed_node2 8218 1726776691.58754: done getting next task for host managed_node2 8218 1726776691.58757: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8218 1726776691.58759: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776691.58775: getting variables 8218 1726776691.58776: in VariableManager get_vars() 8218 1726776691.58806: Calling all_inventory to load vars for managed_node2 8218 1726776691.58809: Calling groups_inventory to load vars for managed_node2 8218 1726776691.58811: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776691.58818: Calling all_plugins_play to load vars for managed_node2 8218 1726776691.58821: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776691.58823: Calling groups_plugins_play to load vars for managed_node2 8218 1726776691.58939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776691.59055: done with get_vars() 8218 1726776691.59063: done getting variables 8218 1726776691.59103: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:11:31 -0400 (0:00:00.034) 0:01:17.421 **** 8218 1726776691.59124: entering _queue_task() for managed_node2/fail 8218 1726776691.59280: worker is 1 (out of 1 available) 8218 1726776691.59294: exiting _queue_task() for managed_node2/fail 8218 1726776691.59305: done queuing things up, now waiting for results queue to drain 8218 1726776691.59306: waiting for pending results... 
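Both skips above hinge on the conditional `__kernel_settings_is_transactional | d(false)`: Jinja2's `default` filter (aliased `d`) substitutes the fallback only when the variable is undefined. A rough Python model of that evaluation (the `_UNDEFINED` sentinel and lookup helper are illustrative, not Ansible internals):

```python
_UNDEFINED = object()  # stand-in for Jinja2's Undefined type

def d(value, fallback=""):
    """Rough model of Jinja2's default/d filter: fallback only when undefined."""
    return fallback if value is _UNDEFINED else value

# In this run the fact was set (source: set_fact) and is False, so the
# conditional evaluates to False and the task is skipped.
facts = {"__kernel_settings_is_transactional": False}
value = facts.get("__kernel_settings_is_transactional", _UNDEFINED)
assert d(value, False) is False

# On a host where the fact was never set, d(false) supplies the default,
# producing the same skip.
value = {}.get("__kernel_settings_is_transactional", _UNDEFINED)
assert d(value, False) is False
```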
11191 1726776691.59432: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 11191 1726776691.59547: in run() - task 120fa90a-8a95-cec2-986e-0000000009f4 11191 1726776691.59563: variable 'ansible_search_path' from source: unknown 11191 1726776691.59568: variable 'ansible_search_path' from source: unknown 11191 1726776691.59594: calling self._execute() 11191 1726776691.59655: variable 'ansible_host' from source: host vars for 'managed_node2' 11191 1726776691.59666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11191 1726776691.59674: variable 'omit' from source: magic vars 11191 1726776691.60005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11191 1726776691.61704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11191 1726776691.61748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11191 1726776691.61777: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11191 1726776691.61804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11191 1726776691.61824: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11191 1726776691.61879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11191 1726776691.61910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11191 1726776691.61932: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11191 1726776691.61958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11191 1726776691.61972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11191 1726776691.62042: variable '__kernel_settings_is_transactional' from source: set_fact 11191 1726776691.62058: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11191 1726776691.62064: when evaluation is False, skipping this task 11191 1726776691.62069: _execute() done 11191 1726776691.62072: dumping result to json 11191 1726776691.62076: done dumping result, returning 11191 1726776691.62083: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-cec2-986e-0000000009f4] 11191 1726776691.62089: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f4 11191 1726776691.62110: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f4 11191 1726776691.62113: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776691.62212: no more pending results, returning what we have 8218 1726776691.62215: results queue empty 8218 1726776691.62216: checking for any_errors_fatal 8218 1726776691.62222: done checking for any_errors_fatal 8218 1726776691.62222: checking for max_fail_percentage 8218 1726776691.62224: done checking for max_fail_percentage 8218 1726776691.62224: 
checking to see if all hosts have failed and the running result is not ok 8218 1726776691.62225: done checking to see if all hosts have failed 8218 1726776691.62226: getting the remaining hosts for this loop 8218 1726776691.62227: done getting the remaining hosts for this loop 8218 1726776691.62232: getting the next task for host managed_node2 8218 1726776691.62239: done getting next task for host managed_node2 8218 1726776691.62242: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8218 1726776691.62245: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776691.62260: getting variables 8218 1726776691.62261: in VariableManager get_vars() 8218 1726776691.62292: Calling all_inventory to load vars for managed_node2 8218 1726776691.62294: Calling groups_inventory to load vars for managed_node2 8218 1726776691.62296: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776691.62304: Calling all_plugins_play to load vars for managed_node2 8218 1726776691.62306: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776691.62309: Calling groups_plugins_play to load vars for managed_node2 8218 1726776691.62636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776691.62745: done with get_vars() 8218 1726776691.62753: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:11:31 -0400 (0:00:00.036) 0:01:17.458 **** 8218 1726776691.62807: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776691.62960: worker is 1 (out of 1 available) 8218 1726776691.62972: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776691.62983: done queuing things up, now waiting for results queue to drain 8218 1726776691.62985: waiting for pending results... 
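The "Fail if reboot is needed and not set" task above was skipped because its conditional `__kernel_settings_is_transactional | d(false)` evaluated to False. The sketch below (an assumption, not Ansible's actual implementation) mimics the semantics of Jinja2's `d()`/`default` filter that drive this: the filter substitutes the fallback only when the variable is undefined, so an undefined or False fact both lead to "Conditional result was False" and a skip. The names `Undefined`, `default`, and `should_run_fail_task` are illustrative.

```python
# Hypothetical model of the conditional seen in the log:
#   when: __kernel_settings_is_transactional | d(false)

class Undefined:
    """Stand-in for Jinja2's Undefined type (variable never set)."""

def default(value, fallback):
    # Jinja2's `default` (alias `d`) filter: substitute the fallback
    # only when the value is undefined, otherwise pass it through.
    return fallback if isinstance(value, Undefined) else value

def should_run_fail_task(host_vars):
    # Look up the fact as the templar would; missing -> Undefined.
    val = host_vars.get("__kernel_settings_is_transactional", Undefined())
    return bool(default(val, False))

# Matches the log: the fact was set to False by an earlier set_fact,
# so the `fail` task is skipped.
print(should_run_fail_task({"__kernel_settings_is_transactional": False}))  # False
print(should_run_fail_task({}))                                             # False
print(should_run_fail_task({"__kernel_settings_is_transactional": True}))   # True
```

On a non-transactional (conventional package-based) system the role sets this fact falsy, which is why the log records "when evaluation is False, skipping this task".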
11192 1726776691.63112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 11192 1726776691.63233: in run() - task 120fa90a-8a95-cec2-986e-0000000009f6 11192 1726776691.63248: variable 'ansible_search_path' from source: unknown 11192 1726776691.63252: variable 'ansible_search_path' from source: unknown 11192 1726776691.63281: calling self._execute() 11192 1726776691.63346: variable 'ansible_host' from source: host vars for 'managed_node2' 11192 1726776691.63355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11192 1726776691.63366: variable 'omit' from source: magic vars 11192 1726776691.63443: variable 'omit' from source: magic vars 11192 1726776691.63482: variable 'omit' from source: magic vars 11192 1726776691.63503: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11192 1726776691.63714: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11192 1726776691.63775: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11192 1726776691.63802: variable 'omit' from source: magic vars 11192 1726776691.63835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11192 1726776691.63860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11192 1726776691.63880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11192 1726776691.63893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11192 1726776691.63904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11192 1726776691.63925: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11192 1726776691.63933: variable 'ansible_host' from 
source: host vars for 'managed_node2' 11192 1726776691.63937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11192 1726776691.64003: Set connection var ansible_connection to ssh 11192 1726776691.64011: Set connection var ansible_pipelining to False 11192 1726776691.64017: Set connection var ansible_timeout to 10 11192 1726776691.64024: Set connection var ansible_module_compression to ZIP_DEFLATED 11192 1726776691.64031: Set connection var ansible_shell_type to sh 11192 1726776691.64037: Set connection var ansible_shell_executable to /bin/sh 11192 1726776691.64051: variable 'ansible_shell_executable' from source: unknown 11192 1726776691.64055: variable 'ansible_connection' from source: unknown 11192 1726776691.64058: variable 'ansible_module_compression' from source: unknown 11192 1726776691.64064: variable 'ansible_shell_type' from source: unknown 11192 1726776691.64067: variable 'ansible_shell_executable' from source: unknown 11192 1726776691.64070: variable 'ansible_host' from source: host vars for 'managed_node2' 11192 1726776691.64074: variable 'ansible_pipelining' from source: unknown 11192 1726776691.64077: variable 'ansible_timeout' from source: unknown 11192 1726776691.64082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11192 1726776691.64203: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11192 1726776691.64215: variable 'omit' from source: magic vars 11192 1726776691.64222: starting attempt loop 11192 1726776691.64225: running the handler 11192 1726776691.64237: _low_level_execute_command(): starting 11192 1726776691.64244: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11192 1726776691.66591: stdout chunk (state=2): >>>/root <<< 11192 1726776691.66712: 
stderr chunk (state=3): >>><<< 11192 1726776691.66718: stdout chunk (state=3): >>><<< 11192 1726776691.66736: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11192 1726776691.66749: _low_level_execute_command(): starting 11192 1726776691.66754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183 `" && echo ansible-tmp-1726776691.667442-11192-85060547129183="` echo /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183 `" ) && sleep 0' 11192 1726776691.69300: stdout chunk (state=2): >>>ansible-tmp-1726776691.667442-11192-85060547129183=/root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183 <<< 11192 1726776691.69424: stderr chunk (state=3): >>><<< 11192 1726776691.69431: stdout chunk (state=3): >>><<< 11192 1726776691.69445: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776691.667442-11192-85060547129183=/root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183 , stderr= 11192 1726776691.69482: variable 'ansible_module_compression' from source: unknown 11192 1726776691.69513: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11192 1726776691.69543: variable 'ansible_facts' from source: unknown 11192 1726776691.69609: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/AnsiballZ_kernel_settings_get_config.py 11192 1726776691.69701: Sending initial data 11192 1726776691.69708: Sent initial data (172 bytes) 11192 1726776691.72154: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpq8_40w75 /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/AnsiballZ_kernel_settings_get_config.py <<< 11192 1726776691.73179: stderr 
chunk (state=3): >>><<< 11192 1726776691.73185: stdout chunk (state=3): >>><<< 11192 1726776691.73203: done transferring module to remote 11192 1726776691.73213: _low_level_execute_command(): starting 11192 1726776691.73217: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/ /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11192 1726776691.75501: stderr chunk (state=2): >>><<< 11192 1726776691.75508: stdout chunk (state=2): >>><<< 11192 1726776691.75521: _low_level_execute_command() done: rc=0, stdout=, stderr= 11192 1726776691.75525: _low_level_execute_command(): starting 11192 1726776691.75531: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11192 1726776691.91424: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 11192 1726776691.92552: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11192 1726776691.92600: stderr chunk (state=3): >>><<< 11192 1726776691.92607: stdout chunk (state=3): >>><<< 11192 1726776691.92624: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 11192 1726776691.92655: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11192 1726776691.92669: _low_level_execute_command(): starting 11192 1726776691.92675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776691.667442-11192-85060547129183/ > /dev/null 2>&1 && sleep 0' 11192 1726776691.95111: stderr chunk (state=2): >>><<< 11192 1726776691.95119: stdout chunk (state=2): >>><<< 11192 1726776691.95135: _low_level_execute_command() done: rc=0, stdout=, stderr= 11192 1726776691.95142: handler run complete 11192 1726776691.95157: attempt loop complete, returning result 11192 1726776691.95160: _execute() done 11192 1726776691.95167: dumping result 
to json 11192 1726776691.95172: done dumping result, returning 11192 1726776691.95180: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-cec2-986e-0000000009f6] 11192 1726776691.95188: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f6 11192 1726776691.95217: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f6 11192 1726776691.95221: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8218 1726776691.95400: no more pending results, returning what we have 8218 1726776691.95403: results queue empty 8218 1726776691.95403: checking for any_errors_fatal 8218 1726776691.95409: done checking for any_errors_fatal 8218 1726776691.95410: checking for max_fail_percentage 8218 1726776691.95411: done checking for max_fail_percentage 8218 1726776691.95412: checking to see if all hosts have failed and the running result is not ok 8218 1726776691.95413: done checking to see if all hosts have failed 8218 1726776691.95413: getting the remaining hosts for this loop 8218 1726776691.95414: done getting the remaining hosts for this loop 8218 1726776691.95418: getting the next task for host managed_node2 8218 1726776691.95424: done getting next task for host managed_node2 8218 1726776691.95427: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8218 1726776691.95430: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776691.95440: getting variables 8218 1726776691.95441: in VariableManager get_vars() 8218 1726776691.95472: Calling all_inventory to load vars for managed_node2 8218 1726776691.95474: Calling groups_inventory to load vars for managed_node2 8218 1726776691.95475: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776691.95482: Calling all_plugins_play to load vars for managed_node2 8218 1726776691.95484: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776691.95485: Calling groups_plugins_play to load vars for managed_node2 8218 1726776691.95593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776691.95714: done with get_vars() 8218 1726776691.95722: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:11:31 -0400 (0:00:00.329) 0:01:17.788 **** 8218 1726776691.95795: entering _queue_task() for managed_node2/stat 8218 1726776691.95953: worker is 1 (out of 1 available) 8218 1726776691.95969: exiting _queue_task() for managed_node2/stat 8218 1726776691.95980: done queuing things up, now waiting for results queue to drain 8218 1726776691.95982: waiting for pending results... 
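The "Read tuned main config" task above ran the collection's `kernel_settings_get_config` module against `/etc/tuned/tuned-main.conf` and returned a flat dict of string-valued settings (`daemon`, `dynamic_tuning`, `sleep_interval`, ...). A minimal sketch of that kind of parse, assuming the file is plain `key = value` lines with no `[section]` header (this is not the collection's actual module code; `read_tuned_main_config` is a hypothetical helper):

```python
# Hypothetical re-creation of reading tuned-main.conf into the dict
# shown in the log's module result.
import configparser

def read_tuned_main_config(text):
    # tuned-main.conf carries bare key = value lines; prepend a dummy
    # section header so stdlib configparser can digest it.
    parser = configparser.ConfigParser(inline_comment_prefixes=("#",))
    parser.read_string("[global]\n" + text)
    # configparser keeps every value as a string, matching the log's
    # output ("daemon": "1", "dynamic_tuning": "0", ...).
    return dict(parser["global"])

sample = """\
daemon = 1
dynamic_tuning = 0
sleep_interval = 1
update_interval = 10
"""
result = read_tuned_main_config(sample)
print(result["dynamic_tuning"])  # 0
```

The role later consults these values (e.g. whether `dynamic_tuning` is enabled) before writing its own tuned profile.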
11203 1726776691.96105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 11203 1726776691.96227: in run() - task 120fa90a-8a95-cec2-986e-0000000009f7 11203 1726776691.96246: variable 'ansible_search_path' from source: unknown 11203 1726776691.96250: variable 'ansible_search_path' from source: unknown 11203 1726776691.96287: variable '__prof_from_conf' from source: task vars 11203 1726776691.96519: variable '__prof_from_conf' from source: task vars 11203 1726776691.96657: variable '__data' from source: task vars 11203 1726776691.96710: variable '__kernel_settings_register_tuned_main' from source: set_fact 11203 1726776691.96849: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11203 1726776691.96859: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11203 1726776691.96902: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11203 1726776691.96984: variable 'omit' from source: magic vars 11203 1726776691.97055: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776691.97065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776691.97075: variable 'omit' from source: magic vars 11203 1726776691.97242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11203 1726776691.98738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11203 1726776691.98790: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11203 1726776691.98819: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11203 1726776691.98848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11203 1726776691.98869: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11203 1726776691.98920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11203 1726776691.98942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11203 1726776691.98960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11203 1726776691.98987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11203 1726776691.98999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11203 1726776691.99062: variable 'item' from source: unknown 11203 1726776691.99075: Evaluated conditional (item | length > 0): False 11203 1726776691.99079: when evaluation is False, skipping this task 11203 1726776691.99099: variable 'item' from source: unknown 11203 1726776691.99151: variable 'item' from source: unknown skipping: [managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 11203 1726776691.99221: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776691.99233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776691.99243: variable 'omit' from source: magic vars 11203 
1726776691.99355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11203 1726776691.99374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11203 1726776691.99391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11203 1726776691.99418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11203 1726776691.99432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11203 1726776691.99482: variable 'item' from source: unknown 11203 1726776691.99491: Evaluated conditional (item | length > 0): True 11203 1726776691.99498: variable 'omit' from source: magic vars 11203 1726776691.99525: variable 'omit' from source: magic vars 11203 1726776691.99554: variable 'item' from source: unknown 11203 1726776691.99597: variable 'item' from source: unknown 11203 1726776691.99611: variable 'omit' from source: magic vars 11203 1726776691.99632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11203 1726776691.99652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11203 1726776691.99667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11203 1726776691.99680: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11203 1726776691.99689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11203 1726776691.99710: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11203 1726776691.99715: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776691.99719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776691.99783: Set connection var ansible_connection to ssh 11203 1726776691.99791: Set connection var ansible_pipelining to False 11203 1726776691.99797: Set connection var ansible_timeout to 10 11203 1726776691.99805: Set connection var ansible_module_compression to ZIP_DEFLATED 11203 1726776691.99810: Set connection var ansible_shell_type to sh 11203 1726776691.99815: Set connection var ansible_shell_executable to /bin/sh 11203 1726776691.99830: variable 'ansible_shell_executable' from source: unknown 11203 1726776691.99834: variable 'ansible_connection' from source: unknown 11203 1726776691.99837: variable 'ansible_module_compression' from source: unknown 11203 1726776691.99840: variable 'ansible_shell_type' from source: unknown 11203 1726776691.99843: variable 'ansible_shell_executable' from source: unknown 11203 1726776691.99847: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776691.99851: variable 'ansible_pipelining' from source: unknown 11203 1726776691.99854: variable 'ansible_timeout' from source: unknown 11203 1726776691.99859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776691.99944: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11203 1726776691.99955: variable 'omit' from source: magic vars 11203 1726776691.99961: starting attempt loop 11203 1726776691.99964: running the handler 11203 1726776691.99975: _low_level_execute_command(): starting 11203 1726776691.99982: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11203 1726776692.02296: stdout chunk (state=2): >>>/root <<< 11203 1726776692.02416: stderr chunk (state=3): >>><<< 11203 1726776692.02422: stdout chunk (state=3): >>><<< 11203 1726776692.02442: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11203 1726776692.02453: _low_level_execute_command(): starting 11203 1726776692.02458: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721 `" && echo ansible-tmp-1726776692.0244963-11203-145016971677721="` echo /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721 `" ) && sleep 0' 11203 1726776692.05298: stdout chunk (state=2): >>>ansible-tmp-1726776692.0244963-11203-145016971677721=/root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721 <<< 11203 1726776692.05432: stderr chunk (state=3): >>><<< 11203 1726776692.05441: stdout chunk (state=3): >>><<< 11203 1726776692.05456: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776692.0244963-11203-145016971677721=/root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721 , stderr= 11203 1726776692.05492: variable 'ansible_module_compression' from source: unknown 11203 1726776692.05534: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11203 1726776692.05564: variable 'ansible_facts' from source: unknown 11203 1726776692.05628: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/AnsiballZ_stat.py 11203 1726776692.05724: Sending initial data 11203 1726776692.05732: Sent initial data (152 bytes) 11203 1726776692.08266: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmphiasjwiq /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/AnsiballZ_stat.py <<< 11203 1726776692.09333: stderr chunk (state=3): >>><<< 11203 1726776692.09339: stdout chunk (state=3): >>><<< 11203 1726776692.09359: done transferring module to remote 11203 1726776692.09370: _low_level_execute_command(): starting 11203 1726776692.09375: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/ /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/AnsiballZ_stat.py && sleep 0' 11203 1726776692.11766: stderr chunk (state=2): >>><<< 11203 1726776692.11773: stdout chunk (state=2): >>><<< 11203 1726776692.11786: _low_level_execute_command() done: rc=0, stdout=, stderr= 11203 1726776692.11790: _low_level_execute_command(): starting 11203 1726776692.11795: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/AnsiballZ_stat.py && sleep 0' 11203 1726776692.27836: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11203 1726776692.28902: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11203 1726776692.28951: stderr chunk (state=3): >>><<< 11203 1726776692.28958: stdout chunk (state=3): >>><<< 11203 1726776692.28974: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 11203 1726776692.28995: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11203 1726776692.29006: _low_level_execute_command(): starting 11203 1726776692.29011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776692.0244963-11203-145016971677721/ > /dev/null 2>&1 && sleep 0' 11203 1726776692.31453: stderr chunk (state=2): >>><<< 11203 1726776692.31461: stdout chunk (state=2): >>><<< 11203 1726776692.31476: _low_level_execute_command() done: rc=0, stdout=, stderr= 11203 1726776692.31483: handler run complete 11203 1726776692.31498: attempt loop complete, returning result 11203 1726776692.31514: variable 'item' from source: unknown 11203 1726776692.31575: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 11203 
1726776692.31662: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776692.31675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776692.31685: variable 'omit' from source: magic vars 11203 1726776692.31789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11203 1726776692.31811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11203 1726776692.31832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11203 1726776692.31858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11203 1726776692.31872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11203 1726776692.31930: variable 'item' from source: unknown 11203 1726776692.31939: Evaluated conditional (item | length > 0): True 11203 1726776692.31945: variable 'omit' from source: magic vars 11203 1726776692.31956: variable 'omit' from source: magic vars 11203 1726776692.31986: variable 'item' from source: unknown 11203 1726776692.32032: variable 'item' from source: unknown 11203 1726776692.32045: variable 'omit' from source: magic vars 11203 1726776692.32062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11203 
1726776692.32072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11203 1726776692.32079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11203 1726776692.32090: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11203 1726776692.32095: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776692.32099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776692.32152: Set connection var ansible_connection to ssh 11203 1726776692.32159: Set connection var ansible_pipelining to False 11203 1726776692.32167: Set connection var ansible_timeout to 10 11203 1726776692.32174: Set connection var ansible_module_compression to ZIP_DEFLATED 11203 1726776692.32179: Set connection var ansible_shell_type to sh 11203 1726776692.32184: Set connection var ansible_shell_executable to /bin/sh 11203 1726776692.32197: variable 'ansible_shell_executable' from source: unknown 11203 1726776692.32201: variable 'ansible_connection' from source: unknown 11203 1726776692.32204: variable 'ansible_module_compression' from source: unknown 11203 1726776692.32207: variable 'ansible_shell_type' from source: unknown 11203 1726776692.32210: variable 'ansible_shell_executable' from source: unknown 11203 1726776692.32214: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726776692.32218: variable 'ansible_pipelining' from source: unknown 11203 1726776692.32222: variable 'ansible_timeout' from source: unknown 11203 1726776692.32226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726776692.32291: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11203 1726776692.32301: variable 'omit' from source: magic vars 11203 1726776692.32306: starting attempt loop 11203 1726776692.32310: running the handler 11203 1726776692.32317: _low_level_execute_command(): starting 11203 1726776692.32321: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11203 1726776692.34477: stdout chunk (state=2): >>>/root <<< 11203 1726776692.34593: stderr chunk (state=3): >>><<< 11203 1726776692.34599: stdout chunk (state=3): >>><<< 11203 1726776692.34612: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11203 1726776692.34621: _low_level_execute_command(): starting 11203 1726776692.34626: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276 `" && echo ansible-tmp-1726776692.3461723-11203-9334449695276="` echo /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276 `" ) && sleep 0' 11203 1726776692.37137: stdout chunk (state=2): >>>ansible-tmp-1726776692.3461723-11203-9334449695276=/root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276 <<< 11203 1726776692.37265: stderr chunk (state=3): >>><<< 11203 1726776692.37272: stdout chunk (state=3): >>><<< 11203 1726776692.37289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776692.3461723-11203-9334449695276=/root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276 , stderr= 11203 1726776692.37318: variable 'ansible_module_compression' from source: unknown 11203 1726776692.37355: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11203 1726776692.37372: variable 'ansible_facts' from source: unknown 11203 
1726776692.37426: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/AnsiballZ_stat.py 11203 1726776692.37513: Sending initial data 11203 1726776692.37520: Sent initial data (150 bytes) 11203 1726776692.39937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmptcjftieg /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/AnsiballZ_stat.py <<< 11203 1726776692.40975: stderr chunk (state=3): >>><<< 11203 1726776692.40981: stdout chunk (state=3): >>><<< 11203 1726776692.40997: done transferring module to remote 11203 1726776692.41005: _low_level_execute_command(): starting 11203 1726776692.41010: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/ /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/AnsiballZ_stat.py && sleep 0' 11203 1726776692.43313: stderr chunk (state=2): >>><<< 11203 1726776692.43321: stdout chunk (state=2): >>><<< 11203 1726776692.43336: _low_level_execute_command() done: rc=0, stdout=, stderr= 11203 1726776692.43340: _low_level_execute_command(): starting 11203 1726776692.43345: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/AnsiballZ_stat.py && sleep 0' 11203 1726776692.59174: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, 
"device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11203 1726776692.60323: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11203 1726776692.60372: stderr chunk (state=3): >>><<< 11203 1726776692.60379: stdout chunk (state=3): >>><<< 11203 1726776692.60395: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
11203 1726776692.60430: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11203 1726776692.60439: _low_level_execute_command(): starting 11203 1726776692.60444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776692.3461723-11203-9334449695276/ > /dev/null 2>&1 && sleep 0' 11203 1726776692.62845: stderr chunk (state=2): >>><<< 11203 1726776692.62852: stdout chunk (state=2): >>><<< 11203 1726776692.62870: _low_level_execute_command() done: rc=0, stdout=, stderr= 11203 1726776692.62876: handler run complete 11203 1726776692.62906: attempt loop complete, returning result 11203 1726776692.62921: variable 'item' from source: unknown 11203 1726776692.62983: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776634.1489303, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726776632.1399238, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726776632.1399238, "nlink": 4, "path": "/etc/tuned", "pw_name": 
"root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11203 1726776692.63031: dumping result to json 11203 1726776692.63042: done dumping result, returning 11203 1726776692.63050: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-cec2-986e-0000000009f7] 11203 1726776692.63056: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f7 11203 1726776692.63100: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f7 11203 1726776692.63104: WORKER PROCESS EXITING 8218 1726776692.63354: no more pending results, returning what we have 8218 1726776692.63357: results queue empty 8218 1726776692.63358: checking for any_errors_fatal 8218 1726776692.63361: done checking for any_errors_fatal 8218 1726776692.63362: checking for max_fail_percentage 8218 1726776692.63363: done checking for max_fail_percentage 8218 1726776692.63364: checking to see if all hosts have failed and the running result is not ok 8218 1726776692.63364: done checking to see if all hosts have failed 8218 1726776692.63365: getting the remaining hosts for this loop 8218 1726776692.63366: done getting the remaining hosts for this loop 8218 1726776692.63369: getting the next task for host managed_node2 8218 1726776692.63373: done getting next task for host managed_node2 8218 1726776692.63376: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8218 1726776692.63379: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776692.63387: getting variables 8218 1726776692.63388: in VariableManager get_vars() 8218 1726776692.63409: Calling all_inventory to load vars for managed_node2 8218 1726776692.63411: Calling groups_inventory to load vars for managed_node2 8218 1726776692.63412: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776692.63419: Calling all_plugins_play to load vars for managed_node2 8218 1726776692.63421: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776692.63423: Calling groups_plugins_play to load vars for managed_node2 8218 1726776692.63519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776692.63637: done with get_vars() 8218 1726776692.63645: done getting variables 8218 1726776692.63688: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:11:32 -0400 (0:00:00.679) 0:01:18.467 **** 8218 1726776692.63710: entering _queue_task() for managed_node2/set_fact 8218 1726776692.63871: worker is 1 (out of 1 available) 8218 1726776692.63886: exiting _queue_task() for managed_node2/set_fact 
8218 1726776692.63898: done queuing things up, now waiting for results queue to drain 8218 1726776692.63900: waiting for pending results... 11218 1726776692.64031: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11218 1726776692.64145: in run() - task 120fa90a-8a95-cec2-986e-0000000009f8 11218 1726776692.64161: variable 'ansible_search_path' from source: unknown 11218 1726776692.64166: variable 'ansible_search_path' from source: unknown 11218 1726776692.64194: calling self._execute() 11218 1726776692.64267: variable 'ansible_host' from source: host vars for 'managed_node2' 11218 1726776692.64276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11218 1726776692.64285: variable 'omit' from source: magic vars 11218 1726776692.64366: variable 'omit' from source: magic vars 11218 1726776692.64400: variable 'omit' from source: magic vars 11218 1726776692.64715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11218 1726776692.66273: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11218 1726776692.66321: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11218 1726776692.66350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11218 1726776692.66379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11218 1726776692.66400: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11218 1726776692.66454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11218 1726776692.66486: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11218 1726776692.66507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11218 1726776692.66536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11218 1726776692.66547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11218 1726776692.66580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11218 1726776692.66597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11218 1726776692.66615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11218 1726776692.66642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11218 1726776692.66653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
11218 1726776692.66692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11218 1726776692.66710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11218 1726776692.66730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11218 1726776692.66755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11218 1726776692.66768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11218 1726776692.66914: variable '__kernel_settings_find_profile_dirs' from source: set_fact 11218 1726776692.66979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11218 1726776692.67086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11218 1726776692.67113: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11218 1726776692.67136: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11218 1726776692.67158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11218 1726776692.67190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11218 1726776692.67207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11218 1726776692.67224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11218 1726776692.67243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11218 1726776692.67281: variable 'omit' from source: magic vars 11218 1726776692.67301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11218 1726776692.67320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11218 1726776692.67336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11218 1726776692.67349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11218 1726776692.67359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11218 1726776692.67383: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11218 1726776692.67388: variable 'ansible_host' from source: host vars for 'managed_node2' 11218 1726776692.67393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11218 1726776692.67458: Set connection var ansible_connection to ssh 11218 1726776692.67468: Set connection var ansible_pipelining to False 11218 1726776692.67474: Set connection var ansible_timeout to 10 
11218 1726776692.67482: Set connection var ansible_module_compression to ZIP_DEFLATED 11218 1726776692.67488: Set connection var ansible_shell_type to sh 11218 1726776692.67493: Set connection var ansible_shell_executable to /bin/sh 11218 1726776692.67509: variable 'ansible_shell_executable' from source: unknown 11218 1726776692.67512: variable 'ansible_connection' from source: unknown 11218 1726776692.67516: variable 'ansible_module_compression' from source: unknown 11218 1726776692.67520: variable 'ansible_shell_type' from source: unknown 11218 1726776692.67523: variable 'ansible_shell_executable' from source: unknown 11218 1726776692.67526: variable 'ansible_host' from source: host vars for 'managed_node2' 11218 1726776692.67532: variable 'ansible_pipelining' from source: unknown 11218 1726776692.67535: variable 'ansible_timeout' from source: unknown 11218 1726776692.67539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11218 1726776692.67603: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11218 1726776692.67613: variable 'omit' from source: magic vars 11218 1726776692.67619: starting attempt loop 11218 1726776692.67622: running the handler 11218 1726776692.67633: handler run complete 11218 1726776692.67642: attempt loop complete, returning result 11218 1726776692.67645: _execute() done 11218 1726776692.67648: dumping result to json 11218 1726776692.67651: done dumping result, returning 11218 1726776692.67658: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-cec2-986e-0000000009f8] 11218 1726776692.67666: sending task result for task 120fa90a-8a95-cec2-986e-0000000009f8 11218 
1726776692.67685: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f8 11218 1726776692.67688: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8218 1726776692.67804: no more pending results, returning what we have 8218 1726776692.67807: results queue empty 8218 1726776692.67807: checking for any_errors_fatal 8218 1726776692.67819: done checking for any_errors_fatal 8218 1726776692.67821: checking for max_fail_percentage 8218 1726776692.67822: done checking for max_fail_percentage 8218 1726776692.67823: checking to see if all hosts have failed and the running result is not ok 8218 1726776692.67823: done checking to see if all hosts have failed 8218 1726776692.67824: getting the remaining hosts for this loop 8218 1726776692.67825: done getting the remaining hosts for this loop 8218 1726776692.67827: getting the next task for host managed_node2 8218 1726776692.67834: done getting next task for host managed_node2 8218 1726776692.67837: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8218 1726776692.67839: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776692.67849: getting variables 8218 1726776692.67850: in VariableManager get_vars() 8218 1726776692.67881: Calling all_inventory to load vars for managed_node2 8218 1726776692.67884: Calling groups_inventory to load vars for managed_node2 8218 1726776692.67886: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776692.67894: Calling all_plugins_play to load vars for managed_node2 8218 1726776692.67896: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776692.67898: Calling groups_plugins_play to load vars for managed_node2 8218 1726776692.68014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776692.68136: done with get_vars() 8218 1726776692.68144: done getting variables 8218 1726776692.68185: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:11:32 -0400 (0:00:00.044) 0:01:18.512 **** 8218 1726776692.68207: entering _queue_task() for managed_node2/service 8218 1726776692.68361: worker is 1 (out of 1 available) 8218 1726776692.68375: exiting _queue_task() for managed_node2/service 8218 1726776692.68386: done queuing things up, now waiting for results queue to drain 8218 1726776692.68387: waiting for pending results... 
11219 1726776692.68516: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11219 1726776692.68632: in run() - task 120fa90a-8a95-cec2-986e-0000000009f9 11219 1726776692.68647: variable 'ansible_search_path' from source: unknown 11219 1726776692.68651: variable 'ansible_search_path' from source: unknown 11219 1726776692.68686: variable '__kernel_settings_services' from source: include_vars 11219 1726776692.68983: variable '__kernel_settings_services' from source: include_vars 11219 1726776692.69038: variable 'omit' from source: magic vars 11219 1726776692.69103: variable 'ansible_host' from source: host vars for 'managed_node2' 11219 1726776692.69111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11219 1726776692.69118: variable 'omit' from source: magic vars 11219 1726776692.69171: variable 'omit' from source: magic vars 11219 1726776692.69197: variable 'omit' from source: magic vars 11219 1726776692.69224: variable 'item' from source: unknown 11219 1726776692.69282: variable 'item' from source: unknown 11219 1726776692.69303: variable 'omit' from source: magic vars 11219 1726776692.69332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11219 1726776692.69355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11219 1726776692.69375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11219 1726776692.69389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11219 1726776692.69398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11219 1726776692.69420: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11219 
1726776692.69425: variable 'ansible_host' from source: host vars for 'managed_node2' 11219 1726776692.69431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11219 1726776692.69497: Set connection var ansible_connection to ssh 11219 1726776692.69505: Set connection var ansible_pipelining to False 11219 1726776692.69512: Set connection var ansible_timeout to 10 11219 1726776692.69519: Set connection var ansible_module_compression to ZIP_DEFLATED 11219 1726776692.69524: Set connection var ansible_shell_type to sh 11219 1726776692.69531: Set connection var ansible_shell_executable to /bin/sh 11219 1726776692.69544: variable 'ansible_shell_executable' from source: unknown 11219 1726776692.69548: variable 'ansible_connection' from source: unknown 11219 1726776692.69551: variable 'ansible_module_compression' from source: unknown 11219 1726776692.69555: variable 'ansible_shell_type' from source: unknown 11219 1726776692.69558: variable 'ansible_shell_executable' from source: unknown 11219 1726776692.69561: variable 'ansible_host' from source: host vars for 'managed_node2' 11219 1726776692.69567: variable 'ansible_pipelining' from source: unknown 11219 1726776692.69571: variable 'ansible_timeout' from source: unknown 11219 1726776692.69575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11219 1726776692.69662: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11219 1726776692.69675: variable 'omit' from source: magic vars 11219 1726776692.69682: starting attempt loop 11219 1726776692.69685: running the handler 11219 1726776692.69745: variable 'ansible_facts' from source: unknown 11219 1726776692.69822: _low_level_execute_command(): starting 11219 1726776692.69832: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11219 1726776692.72163: stdout chunk (state=2): >>>/root <<< 11219 1726776692.72278: stderr chunk (state=3): >>><<< 11219 1726776692.72284: stdout chunk (state=3): >>><<< 11219 1726776692.72302: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11219 1726776692.72314: _low_level_execute_command(): starting 11219 1726776692.72320: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364 `" && echo ansible-tmp-1726776692.7230961-11219-106095996198364="` echo /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364 `" ) && sleep 0' 11219 1726776692.75111: stdout chunk (state=2): >>>ansible-tmp-1726776692.7230961-11219-106095996198364=/root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364 <<< 11219 1726776692.75238: stderr chunk (state=3): >>><<< 11219 1726776692.75244: stdout chunk (state=3): >>><<< 11219 1726776692.75259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776692.7230961-11219-106095996198364=/root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364 , stderr= 11219 1726776692.75283: variable 'ansible_module_compression' from source: unknown 11219 1726776692.75323: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11219 1726776692.75372: variable 'ansible_facts' from source: unknown 11219 1726776692.75524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/AnsiballZ_systemd.py 11219 1726776692.75630: Sending initial data 11219 1726776692.75637: Sent initial data (155 bytes) 11219 1726776692.78110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpxntljqes 
/root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/AnsiballZ_systemd.py <<< 11219 1726776692.80032: stderr chunk (state=3): >>><<< 11219 1726776692.80038: stdout chunk (state=3): >>><<< 11219 1726776692.80057: done transferring module to remote 11219 1726776692.80069: _low_level_execute_command(): starting 11219 1726776692.80074: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/ /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/AnsiballZ_systemd.py && sleep 0' 11219 1726776692.82404: stderr chunk (state=2): >>><<< 11219 1726776692.82411: stdout chunk (state=2): >>><<< 11219 1726776692.82423: _low_level_execute_command() done: rc=0, stdout=, stderr= 11219 1726776692.82427: _low_level_execute_command(): starting 11219 1726776692.82434: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/AnsiballZ_systemd.py && sleep 0' 11219 1726776693.10331: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": 
"0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "22900736", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11219 1726776693.10385: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", 
"LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": 
"infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11219 1726776693.11999: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11219 1726776693.12050: stderr chunk (state=3): >>><<< 11219 1726776693.12057: stdout chunk (state=3): >>><<< 11219 1726776693.12077: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "22900736", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11219 1726776693.12198: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11219 1726776693.12217: _low_level_execute_command(): starting 11219 1726776693.12225: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776692.7230961-11219-106095996198364/ > /dev/null 2>&1 && sleep 0' 11219 1726776693.14637: stderr chunk (state=2): >>><<< 11219 1726776693.14644: stdout chunk (state=2): >>><<< 11219 1726776693.14658: _low_level_execute_command() done: rc=0, stdout=, stderr= 11219 1726776693.14665: handler run complete 11219 1726776693.14700: attempt loop 
complete, returning result 11219 1726776693.14716: variable 'item' from source: unknown 11219 1726776693.14778: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon 
cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "659", "MemoryAccounting": "yes", "MemoryCurrent": "22900736", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": 
"sysinit.target dbus.socket system.slice dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "WatchdogUSec": "0" } } 11219 1726776693.14874: dumping result to json 11219 1726776693.14894: done dumping result, returning 11219 1726776693.14902: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-cec2-986e-0000000009f9] 11219 1726776693.14909: sending task result for task 
120fa90a-8a95-cec2-986e-0000000009f9 11219 1726776693.15017: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009f9 11219 1726776693.15023: WORKER PROCESS EXITING 8218 1726776693.15395: no more pending results, returning what we have 8218 1726776693.15398: results queue empty 8218 1726776693.15398: checking for any_errors_fatal 8218 1726776693.15401: done checking for any_errors_fatal 8218 1726776693.15401: checking for max_fail_percentage 8218 1726776693.15402: done checking for max_fail_percentage 8218 1726776693.15403: checking to see if all hosts have failed and the running result is not ok 8218 1726776693.15403: done checking to see if all hosts have failed 8218 1726776693.15404: getting the remaining hosts for this loop 8218 1726776693.15404: done getting the remaining hosts for this loop 8218 1726776693.15407: getting the next task for host managed_node2 8218 1726776693.15411: done getting next task for host managed_node2 8218 1726776693.15413: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8218 1726776693.15415: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776693.15422: getting variables 8218 1726776693.15423: in VariableManager get_vars() 8218 1726776693.15445: Calling all_inventory to load vars for managed_node2 8218 1726776693.15447: Calling groups_inventory to load vars for managed_node2 8218 1726776693.15448: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776693.15455: Calling all_plugins_play to load vars for managed_node2 8218 1726776693.15456: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776693.15458: Calling groups_plugins_play to load vars for managed_node2 8218 1726776693.15556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776693.15671: done with get_vars() 8218 1726776693.15679: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:11:33 -0400 (0:00:00.475) 0:01:18.988 **** 8218 1726776693.15748: entering _queue_task() for managed_node2/file 8218 1726776693.15909: worker is 1 (out of 1 available) 8218 1726776693.15925: exiting _queue_task() for managed_node2/file 8218 1726776693.15938: done queuing things up, now waiting for results queue to drain 8218 1726776693.15940: waiting for pending results... 
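The `tuned` service task that just completed reports its full `module_args` in the result above (`name: tuned`, `state: started`, `enabled: true`, looped over `item`). A sketch of an equivalent task, reconstructed from those arguments — the actual task in the `fedora.linux_system_roles.kernel_settings` role may be written differently:

```yaml
- name: Ensure required services are enabled and started
  ansible.builtin.systemd:
    name: "{{ item }}"
    state: started
    enabled: true
  loop:
    - tuned
```

Note the log shows `changed: false` because `tuned.service` was already `active`/`enabled`; the `systemd` module is idempotent and only reports a change when it actually alters unit state.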
11230 1726776693.16066: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11230 1726776693.16182: in run() - task 120fa90a-8a95-cec2-986e-0000000009fa 11230 1726776693.16198: variable 'ansible_search_path' from source: unknown 11230 1726776693.16202: variable 'ansible_search_path' from source: unknown 11230 1726776693.16230: calling self._execute() 11230 1726776693.16295: variable 'ansible_host' from source: host vars for 'managed_node2' 11230 1726776693.16304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11230 1726776693.16313: variable 'omit' from source: magic vars 11230 1726776693.16390: variable 'omit' from source: magic vars 11230 1726776693.16426: variable 'omit' from source: magic vars 11230 1726776693.16448: variable '__kernel_settings_profile_dir' from source: role '' all vars 11230 1726776693.16661: variable '__kernel_settings_profile_dir' from source: role '' all vars 11230 1726776693.16733: variable '__kernel_settings_profile_parent' from source: set_fact 11230 1726776693.16741: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11230 1726776693.16790: variable 'omit' from source: magic vars 11230 1726776693.16819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11230 1726776693.16848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11230 1726776693.16866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11230 1726776693.16880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11230 1726776693.16891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11230 1726776693.16911: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11230 1726776693.16916: variable 'ansible_host' from source: host vars for 'managed_node2' 11230 1726776693.16920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11230 1726776693.17001: Set connection var ansible_connection to ssh 11230 1726776693.17009: Set connection var ansible_pipelining to False 11230 1726776693.17016: Set connection var ansible_timeout to 10 11230 1726776693.17023: Set connection var ansible_module_compression to ZIP_DEFLATED 11230 1726776693.17030: Set connection var ansible_shell_type to sh 11230 1726776693.17037: Set connection var ansible_shell_executable to /bin/sh 11230 1726776693.17052: variable 'ansible_shell_executable' from source: unknown 11230 1726776693.17055: variable 'ansible_connection' from source: unknown 11230 1726776693.17059: variable 'ansible_module_compression' from source: unknown 11230 1726776693.17062: variable 'ansible_shell_type' from source: unknown 11230 1726776693.17065: variable 'ansible_shell_executable' from source: unknown 11230 1726776693.17068: variable 'ansible_host' from source: host vars for 'managed_node2' 11230 1726776693.17072: variable 'ansible_pipelining' from source: unknown 11230 1726776693.17075: variable 'ansible_timeout' from source: unknown 11230 1726776693.17079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11230 1726776693.17212: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11230 1726776693.17223: variable 'omit' from source: magic vars 11230 1726776693.17232: starting attempt loop 11230 1726776693.17235: running the handler 11230 1726776693.17247: _low_level_execute_command(): starting 11230 1726776693.17255: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 11230 1726776693.19542: stdout chunk (state=2): >>>/root <<< 11230 1726776693.19658: stderr chunk (state=3): >>><<< 11230 1726776693.19664: stdout chunk (state=3): >>><<< 11230 1726776693.19680: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11230 1726776693.19692: _low_level_execute_command(): starting 11230 1726776693.19697: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780 `" && echo ansible-tmp-1726776693.1968763-11230-68745314286780="` echo /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780 `" ) && sleep 0' 11230 1726776693.22312: stdout chunk (state=2): >>>ansible-tmp-1726776693.1968763-11230-68745314286780=/root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780 <<< 11230 1726776693.22433: stderr chunk (state=3): >>><<< 11230 1726776693.22444: stdout chunk (state=3): >>><<< 11230 1726776693.22455: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776693.1968763-11230-68745314286780=/root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780 , stderr= 11230 1726776693.22485: variable 'ansible_module_compression' from source: unknown 11230 1726776693.22525: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11230 1726776693.22556: variable 'ansible_facts' from source: unknown 11230 1726776693.22622: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/AnsiballZ_file.py 11230 1726776693.22711: Sending initial data 11230 1726776693.22718: Sent initial data (151 bytes) 11230 1726776693.25193: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp933kkhlb /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/AnsiballZ_file.py <<< 11230 
1726776693.26272: stderr chunk (state=3): >>><<< 11230 1726776693.26278: stdout chunk (state=3): >>><<< 11230 1726776693.26295: done transferring module to remote 11230 1726776693.26305: _low_level_execute_command(): starting 11230 1726776693.26310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/ /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/AnsiballZ_file.py && sleep 0' 11230 1726776693.28602: stderr chunk (state=2): >>><<< 11230 1726776693.28609: stdout chunk (state=2): >>><<< 11230 1726776693.28622: _low_level_execute_command() done: rc=0, stdout=, stderr= 11230 1726776693.28626: _low_level_execute_command(): starting 11230 1726776693.28632: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/AnsiballZ_file.py && sleep 0' 11230 1726776693.44623: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11230 1726776693.45752: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11230 1726776693.45806: stderr chunk (state=3): >>><<< 11230 1726776693.45813: stdout chunk (state=3): >>><<< 11230 1726776693.45832: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11230 1726776693.45867: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11230 1726776693.45878: _low_level_execute_command(): starting 11230 1726776693.45884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776693.1968763-11230-68745314286780/ > /dev/null 2>&1 && sleep 0' 11230 1726776693.48331: stderr chunk (state=2): >>><<< 11230 1726776693.48339: stdout chunk (state=2): >>><<< 11230 1726776693.48352: _low_level_execute_command() done: rc=0, stdout=, stderr= 11230 1726776693.48359: handler run complete 11230 1726776693.48379: attempt loop complete, returning result 11230 1726776693.48383: _execute() done 11230 1726776693.48386: dumping result to json 11230 1726776693.48392: done dumping result, returning 11230 1726776693.48400: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-cec2-986e-0000000009fa] 11230 1726776693.48406: sending task result for task 120fa90a-8a95-cec2-986e-0000000009fa 11230 1726776693.48440: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009fa 11230 1726776693.48444: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": 
"/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8218 1726776693.48592: no more pending results, returning what we have 8218 1726776693.48595: results queue empty 8218 1726776693.48596: checking for any_errors_fatal 8218 1726776693.48610: done checking for any_errors_fatal 8218 1726776693.48610: checking for max_fail_percentage 8218 1726776693.48612: done checking for max_fail_percentage 8218 1726776693.48612: checking to see if all hosts have failed and the running result is not ok 8218 1726776693.48613: done checking to see if all hosts have failed 8218 1726776693.48614: getting the remaining hosts for this loop 8218 1726776693.48615: done getting the remaining hosts for this loop 8218 1726776693.48618: getting the next task for host managed_node2 8218 1726776693.48624: done getting next task for host managed_node2 8218 1726776693.48627: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8218 1726776693.48631: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776693.48642: getting variables 8218 1726776693.48643: in VariableManager get_vars() 8218 1726776693.48676: Calling all_inventory to load vars for managed_node2 8218 1726776693.48679: Calling groups_inventory to load vars for managed_node2 8218 1726776693.48681: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776693.48689: Calling all_plugins_play to load vars for managed_node2 8218 1726776693.48691: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776693.48694: Calling groups_plugins_play to load vars for managed_node2 8218 1726776693.48804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776693.48922: done with get_vars() 8218 1726776693.48932: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:11:33 -0400 (0:00:00.332) 0:01:19.320 **** 8218 1726776693.48999: entering _queue_task() for managed_node2/slurp 8218 1726776693.49163: worker is 1 (out of 1 available) 8218 1726776693.49176: exiting _queue_task() for managed_node2/slurp 8218 1726776693.49187: done queuing things up, now waiting for results queue to drain 8218 1726776693.49190: waiting for pending results... 
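[editor's note] The `_low_level_execute_command()` calls above show how each task run begins: Ansible creates a per-task temp directory under `~/.ansible/tmp` on the remote host (the `umask 77` makes it mode 0700), then transfers and executes the AnsiballZ-packed module inside it. Judging from the names in this log (e.g. `ansible-tmp-1726776693.1968763-11230-68745314286780`), the directory name embeds the epoch time, the worker PID, and a random suffix. A minimal sketch of that naming scheme, as an assumption inferred from the log strings rather than Ansible's actual implementation:

```python
import os
import random
import time

def remote_tmp_name() -> str:
    """Build a temp-dir name shaped like the ones in this log:
    ansible-tmp-<epoch float>-<pid>-<random int>.
    (Assumed format, inferred from the log; not Ansible's real code.)"""
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

name = remote_tmp_name()
print(name)
```

Because the name is unique per task attempt, concurrent workers (note the distinct PIDs 11230, 11238, 11246 in this log) never collide on the remote host.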
11238 1726776693.49322: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11238 1726776693.49440: in run() - task 120fa90a-8a95-cec2-986e-0000000009fb 11238 1726776693.49456: variable 'ansible_search_path' from source: unknown 11238 1726776693.49460: variable 'ansible_search_path' from source: unknown 11238 1726776693.49489: calling self._execute() 11238 1726776693.49555: variable 'ansible_host' from source: host vars for 'managed_node2' 11238 1726776693.49566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11238 1726776693.49575: variable 'omit' from source: magic vars 11238 1726776693.49653: variable 'omit' from source: magic vars 11238 1726776693.49689: variable 'omit' from source: magic vars 11238 1726776693.49710: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11238 1726776693.49932: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11238 1726776693.49999: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11238 1726776693.50030: variable 'omit' from source: magic vars 11238 1726776693.50066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11238 1726776693.50153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11238 1726776693.50174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11238 1726776693.50189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11238 1726776693.50200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11238 1726776693.50224: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11238 1726776693.50230: variable 'ansible_host' from 
source: host vars for 'managed_node2' 11238 1726776693.50234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11238 1726776693.50304: Set connection var ansible_connection to ssh 11238 1726776693.50312: Set connection var ansible_pipelining to False 11238 1726776693.50318: Set connection var ansible_timeout to 10 11238 1726776693.50325: Set connection var ansible_module_compression to ZIP_DEFLATED 11238 1726776693.50332: Set connection var ansible_shell_type to sh 11238 1726776693.50338: Set connection var ansible_shell_executable to /bin/sh 11238 1726776693.50353: variable 'ansible_shell_executable' from source: unknown 11238 1726776693.50357: variable 'ansible_connection' from source: unknown 11238 1726776693.50360: variable 'ansible_module_compression' from source: unknown 11238 1726776693.50363: variable 'ansible_shell_type' from source: unknown 11238 1726776693.50369: variable 'ansible_shell_executable' from source: unknown 11238 1726776693.50372: variable 'ansible_host' from source: host vars for 'managed_node2' 11238 1726776693.50376: variable 'ansible_pipelining' from source: unknown 11238 1726776693.50379: variable 'ansible_timeout' from source: unknown 11238 1726776693.50384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11238 1726776693.50523: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11238 1726776693.50536: variable 'omit' from source: magic vars 11238 1726776693.50542: starting attempt loop 11238 1726776693.50545: running the handler 11238 1726776693.50557: _low_level_execute_command(): starting 11238 1726776693.50564: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11238 1726776693.52820: stdout chunk (state=2): >>>/root <<< 11238 1726776693.52941: 
stderr chunk (state=3): >>><<< 11238 1726776693.52949: stdout chunk (state=3): >>><<< 11238 1726776693.52969: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11238 1726776693.52983: _low_level_execute_command(): starting 11238 1726776693.52989: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075 `" && echo ansible-tmp-1726776693.5297794-11238-15667538198075="` echo /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075 `" ) && sleep 0' 11238 1726776693.55527: stdout chunk (state=2): >>>ansible-tmp-1726776693.5297794-11238-15667538198075=/root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075 <<< 11238 1726776693.55653: stderr chunk (state=3): >>><<< 11238 1726776693.55660: stdout chunk (state=3): >>><<< 11238 1726776693.55676: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776693.5297794-11238-15667538198075=/root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075 , stderr= 11238 1726776693.55711: variable 'ansible_module_compression' from source: unknown 11238 1726776693.55747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11238 1726776693.55775: variable 'ansible_facts' from source: unknown 11238 1726776693.55848: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/AnsiballZ_slurp.py 11238 1726776693.55945: Sending initial data 11238 1726776693.55952: Sent initial data (152 bytes) 11238 1726776693.58393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp0v1htqnw /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/AnsiballZ_slurp.py <<< 11238 1726776693.59434: stderr chunk (state=3): >>><<< 11238 1726776693.59440: stdout chunk (state=3): >>><<< 11238 
1726776693.59457: done transferring module to remote 11238 1726776693.59468: _low_level_execute_command(): starting 11238 1726776693.59473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/ /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/AnsiballZ_slurp.py && sleep 0' 11238 1726776693.61761: stderr chunk (state=2): >>><<< 11238 1726776693.61768: stdout chunk (state=2): >>><<< 11238 1726776693.61781: _low_level_execute_command() done: rc=0, stdout=, stderr= 11238 1726776693.61785: _low_level_execute_command(): starting 11238 1726776693.61789: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/AnsiballZ_slurp.py && sleep 0' 11238 1726776693.76589: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11238 1726776693.77579: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11238 1726776693.77626: stderr chunk (state=3): >>><<< 11238 1726776693.77635: stdout chunk (state=3): >>><<< 11238 1726776693.77651: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.12.75 closed. 
11238 1726776693.77679: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11238 1726776693.77690: _low_level_execute_command(): starting 11238 1726776693.77696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776693.5297794-11238-15667538198075/ > /dev/null 2>&1 && sleep 0' 11238 1726776693.80102: stderr chunk (state=2): >>><<< 11238 1726776693.80109: stdout chunk (state=2): >>><<< 11238 1726776693.80121: _low_level_execute_command() done: rc=0, stdout=, stderr= 11238 1726776693.80128: handler run complete 11238 1726776693.80143: attempt loop complete, returning result 11238 1726776693.80147: _execute() done 11238 1726776693.80150: dumping result to json 11238 1726776693.80155: done dumping result, returning 11238 1726776693.80162: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-cec2-986e-0000000009fb] 11238 1726776693.80170: sending task result for task 120fa90a-8a95-cec2-986e-0000000009fb 11238 1726776693.80199: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009fb 11238 1726776693.80203: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8218 1726776693.80389: no more pending 
results, returning what we have 8218 1726776693.80392: results queue empty 8218 1726776693.80393: checking for any_errors_fatal 8218 1726776693.80402: done checking for any_errors_fatal 8218 1726776693.80402: checking for max_fail_percentage 8218 1726776693.80404: done checking for max_fail_percentage 8218 1726776693.80404: checking to see if all hosts have failed and the running result is not ok 8218 1726776693.80405: done checking to see if all hosts have failed 8218 1726776693.80406: getting the remaining hosts for this loop 8218 1726776693.80407: done getting the remaining hosts for this loop 8218 1726776693.80410: getting the next task for host managed_node2 8218 1726776693.80416: done getting next task for host managed_node2 8218 1726776693.80419: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8218 1726776693.80421: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776693.80432: getting variables 8218 1726776693.80433: in VariableManager get_vars() 8218 1726776693.80459: Calling all_inventory to load vars for managed_node2 8218 1726776693.80461: Calling groups_inventory to load vars for managed_node2 8218 1726776693.80462: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776693.80469: Calling all_plugins_play to load vars for managed_node2 8218 1726776693.80471: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776693.80472: Calling groups_plugins_play to load vars for managed_node2 8218 1726776693.80581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776693.80743: done with get_vars() 8218 1726776693.80751: done getting variables 8218 1726776693.80793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:11:33 -0400 (0:00:00.318) 0:01:19.638 **** 8218 1726776693.80817: entering _queue_task() for managed_node2/set_fact 8218 1726776693.80969: worker is 1 (out of 1 available) 8218 1726776693.80983: exiting _queue_task() for managed_node2/set_fact 8218 1726776693.80995: done queuing things up, now waiting for results queue to drain 8218 1726776693.80997: waiting for pending results... 
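[editor's note] The `slurp` result above returns the file content base64-encoded (`"encoding": "base64"`). Decoding the payload from this log with the standard library shows what the role read from `/etc/tuned/active_profile`:

```python
import base64

# Payload copied verbatim from the slurp result in the log above.
content = "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK"
decoded = base64.b64decode(content).decode("utf-8")
print(repr(decoded))  # 'virtual-guest kernel_settings\n'
```

Note the trailing newline in the file; the stripped value matches the `__kernel_settings_active_profile` fact ("virtual-guest kernel_settings") set by the following task.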
11246 1726776693.81133: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11246 1726776693.81251: in run() - task 120fa90a-8a95-cec2-986e-0000000009fc 11246 1726776693.81271: variable 'ansible_search_path' from source: unknown 11246 1726776693.81275: variable 'ansible_search_path' from source: unknown 11246 1726776693.81301: calling self._execute() 11246 1726776693.81372: variable 'ansible_host' from source: host vars for 'managed_node2' 11246 1726776693.81381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11246 1726776693.81390: variable 'omit' from source: magic vars 11246 1726776693.81471: variable 'omit' from source: magic vars 11246 1726776693.81507: variable 'omit' from source: magic vars 11246 1726776693.81800: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11246 1726776693.81811: variable '__cur_profile' from source: task vars 11246 1726776693.81916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11246 1726776693.83418: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11246 1726776693.83472: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11246 1726776693.83501: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11246 1726776693.83527: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11246 1726776693.83548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11246 1726776693.83601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11246 
1726776693.83622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11246 1726776693.83642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11246 1726776693.83669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11246 1726776693.83681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11246 1726776693.83753: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11246 1726776693.83793: variable 'omit' from source: magic vars 11246 1726776693.83814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11246 1726776693.83836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11246 1726776693.83851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11246 1726776693.83864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11246 1726776693.83874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11246 1726776693.83896: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11246 1726776693.83901: variable 'ansible_host' from source: host vars for 'managed_node2' 11246 1726776693.83906: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11246 1726776693.83971: Set connection var ansible_connection to ssh 11246 1726776693.83979: Set connection var ansible_pipelining to False 11246 1726776693.83985: Set connection var ansible_timeout to 10 11246 1726776693.83992: Set connection var ansible_module_compression to ZIP_DEFLATED 11246 1726776693.83997: Set connection var ansible_shell_type to sh 11246 1726776693.84002: Set connection var ansible_shell_executable to /bin/sh 11246 1726776693.84017: variable 'ansible_shell_executable' from source: unknown 11246 1726776693.84021: variable 'ansible_connection' from source: unknown 11246 1726776693.84024: variable 'ansible_module_compression' from source: unknown 11246 1726776693.84027: variable 'ansible_shell_type' from source: unknown 11246 1726776693.84032: variable 'ansible_shell_executable' from source: unknown 11246 1726776693.84035: variable 'ansible_host' from source: host vars for 'managed_node2' 11246 1726776693.84040: variable 'ansible_pipelining' from source: unknown 11246 1726776693.84043: variable 'ansible_timeout' from source: unknown 11246 1726776693.84047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11246 1726776693.84105: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11246 1726776693.84121: variable 'omit' from source: magic vars 11246 1726776693.84127: starting attempt loop 11246 1726776693.84132: running the handler 11246 1726776693.84142: handler run complete 11246 1726776693.84151: attempt loop complete, returning result 11246 1726776693.84154: _execute() done 11246 1726776693.84156: dumping result to json 11246 1726776693.84160: done dumping result, returning 11246 1726776693.84166: done running TaskExecutor() 
for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-cec2-986e-0000000009fc] 11246 1726776693.84173: sending task result for task 120fa90a-8a95-cec2-986e-0000000009fc 11246 1726776693.84192: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009fc 11246 1726776693.84195: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8218 1726776693.84315: no more pending results, returning what we have 8218 1726776693.84318: results queue empty 8218 1726776693.84318: checking for any_errors_fatal 8218 1726776693.84324: done checking for any_errors_fatal 8218 1726776693.84324: checking for max_fail_percentage 8218 1726776693.84326: done checking for max_fail_percentage 8218 1726776693.84326: checking to see if all hosts have failed and the running result is not ok 8218 1726776693.84327: done checking to see if all hosts have failed 8218 1726776693.84328: getting the remaining hosts for this loop 8218 1726776693.84330: done getting the remaining hosts for this loop 8218 1726776693.84333: getting the next task for host managed_node2 8218 1726776693.84338: done getting next task for host managed_node2 8218 1726776693.84341: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8218 1726776693.84343: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776693.84358: getting variables 8218 1726776693.84359: in VariableManager get_vars() 8218 1726776693.84392: Calling all_inventory to load vars for managed_node2 8218 1726776693.84394: Calling groups_inventory to load vars for managed_node2 8218 1726776693.84396: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776693.84404: Calling all_plugins_play to load vars for managed_node2 8218 1726776693.84406: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776693.84408: Calling groups_plugins_play to load vars for managed_node2 8218 1726776693.84516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776693.84637: done with get_vars() 8218 1726776693.84645: done getting variables 8218 1726776693.84687: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 16:11:33 -0400 (0:00:00.038) 0:01:19.677 **** 8218 1726776693.84709: entering _queue_task() for managed_node2/copy 8218 1726776693.84867: worker is 1 (out of 1 available) 8218 1726776693.84881: exiting _queue_task() for managed_node2/copy 8218 1726776693.84891: done queuing things up, now waiting for results queue to drain 8218 1726776693.84893: waiting for pending results... 
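[editor's note] The copy task queued here rewrites `/etc/tuned/active_profile` so that `kernel_settings` is present in the space-separated profile list. A rough sketch of that idempotency check, as my assumption of the role's logic inferred from the facts in this log, not the role's actual code:

```python
def ensure_in_profile(active: str, profile: str = "kernel_settings") -> str:
    """Append `profile` to a space-separated profile list if it is missing.
    (Hypothetical helper illustrating the apparent intent of the task.)"""
    names = active.split()
    if profile not in names:
        names.append(profile)
    return " ".join(names)

# The fact seen in this log already contains kernel_settings,
# so appending is a no-op here.
print(ensure_in_profile("virtual-guest kernel_settings"))
print(ensure_in_profile("virtual-guest"))
```

Since "virtual-guest kernel_settings" already contains the profile, the file content written back is unchanged.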
11247 1726776693.85015: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11247 1726776693.85127: in run() - task 120fa90a-8a95-cec2-986e-0000000009fd 11247 1726776693.85143: variable 'ansible_search_path' from source: unknown 11247 1726776693.85147: variable 'ansible_search_path' from source: unknown 11247 1726776693.85174: calling self._execute() 11247 1726776693.85239: variable 'ansible_host' from source: host vars for 'managed_node2' 11247 1726776693.85249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11247 1726776693.85258: variable 'omit' from source: magic vars 11247 1726776693.85332: variable 'omit' from source: magic vars 11247 1726776693.85368: variable 'omit' from source: magic vars 11247 1726776693.85389: variable '__kernel_settings_active_profile' from source: set_fact 11247 1726776693.85596: variable '__kernel_settings_active_profile' from source: set_fact 11247 1726776693.85618: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11247 1726776693.85672: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11247 1726776693.85725: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11247 1726776693.85806: variable 'omit' from source: magic vars 11247 1726776693.85839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11247 1726776693.85863: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11247 1726776693.85881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11247 1726776693.85894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11247 1726776693.85904: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11247 1726776693.85926: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11247 1726776693.85932: variable 'ansible_host' from source: host vars for 'managed_node2' 11247 1726776693.85935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11247 1726776693.85997: Set connection var ansible_connection to ssh 11247 1726776693.86002: Set connection var ansible_pipelining to False 11247 1726776693.86006: Set connection var ansible_timeout to 10 11247 1726776693.86011: Set connection var ansible_module_compression to ZIP_DEFLATED 11247 1726776693.86014: Set connection var ansible_shell_type to sh 11247 1726776693.86017: Set connection var ansible_shell_executable to /bin/sh 11247 1726776693.86042: variable 'ansible_shell_executable' from source: unknown 11247 1726776693.86046: variable 'ansible_connection' from source: unknown 11247 1726776693.86050: variable 'ansible_module_compression' from source: unknown 11247 1726776693.86053: variable 'ansible_shell_type' from source: unknown 11247 1726776693.86057: variable 'ansible_shell_executable' from source: unknown 11247 1726776693.86060: variable 'ansible_host' from source: host vars for 'managed_node2' 11247 1726776693.86064: variable 'ansible_pipelining' from source: unknown 11247 1726776693.86068: variable 'ansible_timeout' from source: unknown 11247 1726776693.86072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11247 1726776693.86157: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11247 1726776693.86170: variable 'omit' from source: magic vars 11247 1726776693.86176: starting attempt 
loop 11247 1726776693.86180: running the handler 11247 1726776693.86190: _low_level_execute_command(): starting 11247 1726776693.86198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11247 1726776693.88500: stdout chunk (state=2): >>>/root <<< 11247 1726776693.88616: stderr chunk (state=3): >>><<< 11247 1726776693.88623: stdout chunk (state=3): >>><<< 11247 1726776693.88641: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11247 1726776693.88654: _low_level_execute_command(): starting 11247 1726776693.88659: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173 `" && echo ansible-tmp-1726776693.8864908-11247-121241471760173="` echo /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173 `" ) && sleep 0' 11247 1726776693.91176: stdout chunk (state=2): >>>ansible-tmp-1726776693.8864908-11247-121241471760173=/root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173 <<< 11247 1726776693.91298: stderr chunk (state=3): >>><<< 11247 1726776693.91304: stdout chunk (state=3): >>><<< 11247 1726776693.91318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776693.8864908-11247-121241471760173=/root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173 , stderr= 11247 1726776693.91388: variable 'ansible_module_compression' from source: unknown 11247 1726776693.91433: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11247 1726776693.91461: variable 'ansible_facts' from source: unknown 11247 1726776693.91525: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_stat.py 11247 1726776693.91606: Sending initial data 11247 1726776693.91613: Sent initial data (152 bytes) 11247 1726776693.94037: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmptsqa8vhk /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_stat.py <<< 11247 1726776693.95074: stderr chunk (state=3): >>><<< 11247 1726776693.95082: stdout chunk (state=3): >>><<< 11247 1726776693.95099: done transferring module to remote 11247 1726776693.95109: _low_level_execute_command(): starting 11247 1726776693.95114: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/ /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_stat.py && sleep 0' 11247 1726776693.97417: stderr chunk (state=2): >>><<< 11247 1726776693.97424: stdout chunk (state=2): >>><<< 11247 1726776693.97438: _low_level_execute_command() done: rc=0, stdout=, stderr= 11247 1726776693.97442: _low_level_execute_command(): starting 11247 1726776693.97447: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_stat.py && sleep 0' 11247 1726776694.13565: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776693.7635617, "mtime": 1726776686.0835326, "ctime": 1726776686.0835326, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": 
"500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11247 1726776694.14751: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11247 1726776694.14798: stderr chunk (state=3): >>><<< 11247 1726776694.14805: stdout chunk (state=3): >>><<< 11247 1726776694.14822: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776693.7635617, "mtime": 1726776686.0835326, "ctime": 1726776686.0835326, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11247 1726776694.14866: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11247 1726776694.14905: variable 'ansible_module_compression' from source: unknown 11247 1726776694.14939: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11247 1726776694.14959: variable 'ansible_facts' from source: unknown 11247 1726776694.15017: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_file.py 11247 1726776694.15111: Sending initial data 11247 1726776694.15118: Sent initial data (152 bytes) 11247 1726776694.17662: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpgwphafk4 /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_file.py <<< 11247 1726776694.18758: stderr chunk (state=3): >>><<< 11247 1726776694.18764: stdout chunk (state=3): >>><<< 11247 1726776694.18781: done transferring module to remote 11247 1726776694.18790: _low_level_execute_command(): starting 11247 1726776694.18794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/ 
/root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_file.py && sleep 0' 11247 1726776694.21162: stderr chunk (state=2): >>><<< 11247 1726776694.21171: stdout chunk (state=2): >>><<< 11247 1726776694.21185: _low_level_execute_command() done: rc=0, stdout=, stderr= 11247 1726776694.21189: _low_level_execute_command(): starting 11247 1726776694.21194: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/AnsiballZ_file.py && sleep 0' 11247 1726776694.37249: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpbxzmbadq", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11247 1726776694.38371: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11247 1726776694.38417: stderr chunk (state=3): >>><<< 11247 1726776694.38424: stdout chunk (state=3): >>><<< 11247 1726776694.38442: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpbxzmbadq", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11247 1726776694.38472: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpbxzmbadq', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11247 1726776694.38483: _low_level_execute_command(): starting 11247 1726776694.38489: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776693.8864908-11247-121241471760173/ > /dev/null 2>&1 && sleep 0' 11247 1726776694.40925: stderr chunk (state=2): >>><<< 11247 1726776694.40933: stdout chunk (state=2): >>><<< 11247 1726776694.40946: _low_level_execute_command() done: rc=0, stdout=, stderr= 11247 1726776694.40954: handler run complete 11247 1726776694.40976: attempt loop complete, returning result 11247 1726776694.40980: _execute() done 11247 1726776694.40983: dumping result to json 11247 1726776694.40988: done dumping result, returning 11247 1726776694.40996: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-cec2-986e-0000000009fd] 11247 1726776694.41003: sending task result for task 120fa90a-8a95-cec2-986e-0000000009fd 11247 1726776694.41038: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009fd 11247 1726776694.41042: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8218 1726776694.41202: no more pending results, returning what we have 8218 1726776694.41206: results queue empty 8218 1726776694.41206: checking for any_errors_fatal 8218 1726776694.41211: done checking for any_errors_fatal 8218 1726776694.41212: checking for max_fail_percentage 8218 1726776694.41214: done checking for max_fail_percentage 8218 1726776694.41214: checking to see if all hosts have failed and the running result is not ok 8218 1726776694.41215: done checking to see if all hosts have failed 8218 1726776694.41215: getting the remaining hosts for this loop 8218 1726776694.41217: done getting the remaining hosts for this loop 8218 1726776694.41219: getting the next task for host managed_node2 8218 1726776694.41225: done getting next task for host managed_node2 8218 1726776694.41228: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8218 1726776694.41232: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776694.41242: getting variables 8218 1726776694.41243: in VariableManager get_vars() 8218 1726776694.41276: Calling all_inventory to load vars for managed_node2 8218 1726776694.41279: Calling groups_inventory to load vars for managed_node2 8218 1726776694.41280: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776694.41288: Calling all_plugins_play to load vars for managed_node2 8218 1726776694.41291: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776694.41293: Calling groups_plugins_play to load vars for managed_node2 8218 1726776694.41442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776694.41558: done with get_vars() 8218 1726776694.41567: done getting variables 8218 1726776694.41608: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 16:11:34 -0400 (0:00:00.569) 0:01:20.246 **** 8218 1726776694.41633: entering _queue_task() for managed_node2/copy 8218 1726776694.41787: worker is 1 (out of 1 available) 8218 1726776694.41801: exiting _queue_task() for managed_node2/copy 8218 1726776694.41813: done queuing things up, now waiting for results queue to drain 8218 1726776694.41814: waiting for pending results... 
11262 1726776694.41951: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11262 1726776694.42062: in run() - task 120fa90a-8a95-cec2-986e-0000000009fe 11262 1726776694.42081: variable 'ansible_search_path' from source: unknown 11262 1726776694.42085: variable 'ansible_search_path' from source: unknown 11262 1726776694.42111: calling self._execute() 11262 1726776694.42179: variable 'ansible_host' from source: host vars for 'managed_node2' 11262 1726776694.42190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11262 1726776694.42198: variable 'omit' from source: magic vars 11262 1726776694.42277: variable 'omit' from source: magic vars 11262 1726776694.42310: variable 'omit' from source: magic vars 11262 1726776694.42333: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11262 1726776694.42550: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11262 1726776694.42612: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11262 1726776694.42644: variable 'omit' from source: magic vars 11262 1726776694.42678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11262 1726776694.42702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11262 1726776694.42717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11262 1726776694.42728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11262 1726776694.42742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11262 1726776694.42766: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11262 1726776694.42772: variable 'ansible_host' from 
source: host vars for 'managed_node2' 11262 1726776694.42776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11262 1726776694.42844: Set connection var ansible_connection to ssh 11262 1726776694.42852: Set connection var ansible_pipelining to False 11262 1726776694.42859: Set connection var ansible_timeout to 10 11262 1726776694.42868: Set connection var ansible_module_compression to ZIP_DEFLATED 11262 1726776694.42873: Set connection var ansible_shell_type to sh 11262 1726776694.42879: Set connection var ansible_shell_executable to /bin/sh 11262 1726776694.42893: variable 'ansible_shell_executable' from source: unknown 11262 1726776694.42897: variable 'ansible_connection' from source: unknown 11262 1726776694.42901: variable 'ansible_module_compression' from source: unknown 11262 1726776694.42904: variable 'ansible_shell_type' from source: unknown 11262 1726776694.42906: variable 'ansible_shell_executable' from source: unknown 11262 1726776694.42908: variable 'ansible_host' from source: host vars for 'managed_node2' 11262 1726776694.42910: variable 'ansible_pipelining' from source: unknown 11262 1726776694.42912: variable 'ansible_timeout' from source: unknown 11262 1726776694.42914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11262 1726776694.43005: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11262 1726776694.43017: variable 'omit' from source: magic vars 11262 1726776694.43024: starting attempt loop 11262 1726776694.43029: running the handler 11262 1726776694.43040: _low_level_execute_command(): starting 11262 1726776694.43046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11262 1726776694.45375: stdout chunk (state=2): 
>>>/root <<< 11262 1726776694.45494: stderr chunk (state=3): >>><<< 11262 1726776694.45501: stdout chunk (state=3): >>><<< 11262 1726776694.45517: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11262 1726776694.45531: _low_level_execute_command(): starting 11262 1726776694.45537: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977 `" && echo ansible-tmp-1726776694.455244-11262-42456504946977="` echo /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977 `" ) && sleep 0' 11262 1726776694.48108: stdout chunk (state=2): >>>ansible-tmp-1726776694.455244-11262-42456504946977=/root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977 <<< 11262 1726776694.48237: stderr chunk (state=3): >>><<< 11262 1726776694.48244: stdout chunk (state=3): >>><<< 11262 1726776694.48259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776694.455244-11262-42456504946977=/root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977 , stderr= 11262 1726776694.48331: variable 'ansible_module_compression' from source: unknown 11262 1726776694.48378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11262 1726776694.48408: variable 'ansible_facts' from source: unknown 11262 1726776694.48476: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_stat.py 11262 1726776694.48557: Sending initial data 11262 1726776694.48566: Sent initial data (150 bytes) 11262 1726776694.51064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmptiavgjc6 /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_stat.py <<< 11262 1726776694.52138: stderr chunk (state=3): >>><<< 11262 1726776694.52149: stdout chunk (state=3): 
>>><<< 11262 1726776694.52171: done transferring module to remote 11262 1726776694.52183: _low_level_execute_command(): starting 11262 1726776694.52188: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/ /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_stat.py && sleep 0' 11262 1726776694.54553: stderr chunk (state=2): >>><<< 11262 1726776694.54564: stdout chunk (state=2): >>><<< 11262 1726776694.54581: _low_level_execute_command() done: rc=0, stdout=, stderr= 11262 1726776694.54588: _low_level_execute_command(): starting 11262 1726776694.54594: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_stat.py && sleep 0' 11262 1726776694.70499: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776684.2785256, "mtime": 1726776686.0835326, "ctime": 1726776686.0835326, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11262 1726776694.71619: stderr chunk (state=3): >>>Shared 
connection to 10.31.12.75 closed. <<< 11262 1726776694.71671: stderr chunk (state=3): >>><<< 11262 1726776694.71677: stdout chunk (state=3): >>><<< 11262 1726776694.71695: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776684.2785256, "mtime": 1726776686.0835326, "ctime": 1726776686.0835326, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11262 1726776694.71739: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11262 1726776694.71777: variable 'ansible_module_compression' from source: unknown 11262 1726776694.71809: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11262 1726776694.71831: variable 'ansible_facts' from source: unknown 11262 1726776694.71887: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_file.py 11262 1726776694.71979: Sending initial data 11262 1726776694.71986: Sent initial data (150 bytes) 11262 1726776694.74511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpilvfxjg5 /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_file.py <<< 11262 1726776694.75609: stderr chunk (state=3): >>><<< 11262 1726776694.75617: stdout chunk (state=3): >>><<< 11262 1726776694.75637: done transferring module to remote 11262 1726776694.75646: _low_level_execute_command(): starting 11262 1726776694.75651: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/ 
/root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_file.py && sleep 0' 11262 1726776694.78014: stderr chunk (state=2): >>><<< 11262 1726776694.78021: stdout chunk (state=2): >>><<< 11262 1726776694.78036: _low_level_execute_command() done: rc=0, stdout=, stderr= 11262 1726776694.78040: _low_level_execute_command(): starting 11262 1726776694.78046: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/AnsiballZ_file.py && sleep 0' 11262 1726776694.94076: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpbd3aqjhq", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11262 1726776694.95172: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11262 1726776694.95220: stderr chunk (state=3): >>><<< 11262 1726776694.95226: stdout chunk (state=3): >>><<< 11262 1726776694.95244: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpbd3aqjhq", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
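An aside for readers of this trace: the chmod → execute → cleanup sequence above is Ansible's standard module round-trip, and every `_low_level_execute_command()` in this log shares the same wrapper shape. A minimal sketch of how those one-liners are composed (illustrative only, not Ansible's actual source; the temp-dir path is copied from the trace above):

```python
# Illustrative sketch, not Ansible source: composing the /bin/sh one-liners
# seen in this trace. Ansible appends "&& sleep 0" to every low-level command.
def wrap_low_level(cmd: str) -> str:
    """Wrap a remote command as /bin/sh -c '<cmd> && sleep 0'."""
    return "/bin/sh -c '%s && sleep 0'" % cmd

# Temp dir and module path copied from the trace above.
tmpdir = "/root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/"
module = tmpdir + "AnsiballZ_file.py"

chmod_cmd = wrap_low_level("chmod u+x %s %s" % (tmpdir, module))
exec_cmd = wrap_low_level("/usr/libexec/platform-python %s" % module)
cleanup_cmd = wrap_low_level("rm -f -r %s > /dev/null 2>&1" % tmpdir)
```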
11262 1726776694.95272: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpbd3aqjhq', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11262 1726776694.95283: _low_level_execute_command(): starting 11262 1726776694.95289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776694.455244-11262-42456504946977/ > /dev/null 2>&1 && sleep 0' 11262 1726776694.97719: stderr chunk (state=2): >>><<< 11262 1726776694.97726: stdout chunk (state=2): >>><<< 11262 1726776694.97741: _low_level_execute_command() done: rc=0, stdout=, stderr= 11262 1726776694.97749: handler run complete 11262 1726776694.97771: attempt loop complete, returning result 11262 1726776694.97775: _execute() done 11262 1726776694.97778: dumping result to json 11262 1726776694.97783: done dumping result, returning 11262 1726776694.97791: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-cec2-986e-0000000009fe] 11262 1726776694.97797: sending task result for task 120fa90a-8a95-cec2-986e-0000000009fe 11262 1726776694.97831: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009fe 11262 1726776694.97835: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8218 1726776694.97998: no more pending results, returning what we have 8218 1726776694.98001: results queue empty 8218 1726776694.98002: checking for any_errors_fatal 8218 1726776694.98010: done checking for any_errors_fatal 8218 1726776694.98011: checking for max_fail_percentage 8218 1726776694.98012: done checking for max_fail_percentage 8218 1726776694.98013: checking to see if all hosts have failed and the running result is not ok 8218 1726776694.98014: done checking to see if all hosts have failed 8218 1726776694.98014: getting the remaining hosts for this loop 8218 1726776694.98015: done getting the remaining hosts for this loop 8218 1726776694.98018: getting the next task for host managed_node2 8218 1726776694.98023: done getting next task for host managed_node2 8218 1726776694.98026: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8218 1726776694.98030: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776694.98040: getting variables 8218 1726776694.98041: in VariableManager get_vars() 8218 1726776694.98076: Calling all_inventory to load vars for managed_node2 8218 1726776694.98079: Calling groups_inventory to load vars for managed_node2 8218 1726776694.98081: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776694.98088: Calling all_plugins_play to load vars for managed_node2 8218 1726776694.98090: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776694.98092: Calling groups_plugins_play to load vars for managed_node2 8218 1726776694.98200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776694.98320: done with get_vars() 8218 1726776694.98330: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Get current config] **********
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107
Thursday 19 September 2024 16:11:34 -0400 (0:00:00.567) 0:01:20.814 ****

8218 1726776694.98392: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776694.98553: worker is 1 (out of 1 available) 8218 1726776694.98570: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776694.98581: done queuing things up, now waiting for results queue to drain 8218 1726776694.98584: waiting for pending results...
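Before running the module just queued, the worker will again create a remote temp directory (visible a few entries below). For reading the log, the `ansible-tmp-*` names throughout this trace appear to follow an epoch-pid-random pattern; the reconstruction below is inferred purely from the names in this log, not taken from Ansible's source:

```python
import random
import time

def make_tmp_name(worker_pid: int) -> str:
    """Approximate the ansible-tmp-<epoch>-<pid>-<random> names seen in this
    trace (e.g. ansible-tmp-1726776695.0245867-11277-47012029270228).
    The exact scheme is an assumption based on the log, not Ansible's code."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), worker_pid, random.randint(0, 2 ** 48))

name = make_tmp_name(11277)
```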
11277 1726776694.98710: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11277 1726776694.98816: in run() - task 120fa90a-8a95-cec2-986e-0000000009ff 11277 1726776694.98834: variable 'ansible_search_path' from source: unknown 11277 1726776694.98837: variable 'ansible_search_path' from source: unknown 11277 1726776694.98863: calling self._execute() 11277 1726776694.98935: variable 'ansible_host' from source: host vars for 'managed_node2' 11277 1726776694.98943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11277 1726776694.98952: variable 'omit' from source: magic vars 11277 1726776694.99031: variable 'omit' from source: magic vars 11277 1726776694.99068: variable 'omit' from source: magic vars 11277 1726776694.99090: variable '__kernel_settings_profile_filename' from source: role '' all vars 11277 1726776694.99307: variable '__kernel_settings_profile_filename' from source: role '' all vars 11277 1726776694.99369: variable '__kernel_settings_profile_dir' from source: role '' all vars 11277 1726776694.99497: variable '__kernel_settings_profile_parent' from source: set_fact 11277 1726776694.99505: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11277 1726776694.99539: variable 'omit' from source: magic vars 11277 1726776694.99571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11277 1726776694.99597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11277 1726776694.99614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11277 1726776694.99627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11277 1726776694.99640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11277 1726776694.99662: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11277 1726776694.99667: variable 'ansible_host' from source: host vars for 'managed_node2' 11277 1726776694.99670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11277 1726776694.99740: Set connection var ansible_connection to ssh 11277 1726776694.99748: Set connection var ansible_pipelining to False 11277 1726776694.99754: Set connection var ansible_timeout to 10 11277 1726776694.99761: Set connection var ansible_module_compression to ZIP_DEFLATED 11277 1726776694.99766: Set connection var ansible_shell_type to sh 11277 1726776694.99771: Set connection var ansible_shell_executable to /bin/sh 11277 1726776694.99787: variable 'ansible_shell_executable' from source: unknown 11277 1726776694.99791: variable 'ansible_connection' from source: unknown 11277 1726776694.99795: variable 'ansible_module_compression' from source: unknown 11277 1726776694.99799: variable 'ansible_shell_type' from source: unknown 11277 1726776694.99802: variable 'ansible_shell_executable' from source: unknown 11277 1726776694.99805: variable 'ansible_host' from source: host vars for 'managed_node2' 11277 1726776694.99810: variable 'ansible_pipelining' from source: unknown 11277 1726776694.99813: variable 'ansible_timeout' from source: unknown 11277 1726776694.99817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11277 1726776694.99943: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11277 1726776694.99955: variable 'omit' from source: magic vars 11277 1726776694.99961: starting attempt loop 11277 1726776694.99964: running the handler 11277 1726776694.99977: _low_level_execute_command(): 
starting 11277 1726776694.99985: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11277 1726776695.02304: stdout chunk (state=2): >>>/root <<< 11277 1726776695.02425: stderr chunk (state=3): >>><<< 11277 1726776695.02433: stdout chunk (state=3): >>><<< 11277 1726776695.02450: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11277 1726776695.02464: _low_level_execute_command(): starting 11277 1726776695.02470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228 `" && echo ansible-tmp-1726776695.0245867-11277-47012029270228="` echo /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228 `" ) && sleep 0' 11277 1726776695.05148: stdout chunk (state=2): >>>ansible-tmp-1726776695.0245867-11277-47012029270228=/root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228 <<< 11277 1726776695.05281: stderr chunk (state=3): >>><<< 11277 1726776695.05288: stdout chunk (state=3): >>><<< 11277 1726776695.05303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776695.0245867-11277-47012029270228=/root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228 , stderr= 11277 1726776695.05341: variable 'ansible_module_compression' from source: unknown 11277 1726776695.05376: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11277 1726776695.05406: variable 'ansible_facts' from source: unknown 11277 1726776695.05474: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/AnsiballZ_kernel_settings_get_config.py 11277 1726776695.05574: Sending initial data 11277 1726776695.05581: Sent initial data (173 bytes) 11277 1726776695.08073: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/tmppx47xiwp /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/AnsiballZ_kernel_settings_get_config.py <<< 11277 1726776695.09130: stderr chunk (state=3): >>><<< 11277 1726776695.09136: stdout chunk (state=3): >>><<< 11277 1726776695.09154: done transferring module to remote 11277 1726776695.09164: _low_level_execute_command(): starting 11277 1726776695.09170: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/ /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11277 1726776695.11568: stderr chunk (state=2): >>><<< 11277 1726776695.11576: stdout chunk (state=2): >>><<< 11277 1726776695.11588: _low_level_execute_command() done: rc=0, stdout=, stderr= 11277 1726776695.11592: _low_level_execute_command(): starting 11277 1726776695.11597: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11277 1726776695.27542: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11277 1726776695.28649: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11277 1726776695.28698: stderr chunk (state=3): >>><<< 11277 1726776695.28704: stdout chunk (state=3): >>><<< 11277 1726776695.28723: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 11277 1726776695.28751: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11277 1726776695.28762: _low_level_execute_command(): starting 11277 1726776695.28769: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776695.0245867-11277-47012029270228/ > /dev/null 2>&1 && sleep 0' 11277 1726776695.31193: stderr chunk (state=2): >>><<< 11277 1726776695.31203: stdout chunk (state=2): >>><<< 11277 1726776695.31217: _low_level_execute_command() done: rc=0, stdout=, stderr= 11277 1726776695.31224: handler run complete 
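The result above is essentially the parsed contents of `/etc/tuned/kernel_settings/tuned.conf`. As a rough stand-in for what `kernel_settings_get_config` does (assumed behavior based only on the `module_args` and return shape visible in this trace, not the module's real code), a tuned.conf-style file can be read into the same nested-dict shape with `configparser`:

```python
import configparser

def get_tuned_config(text: str) -> dict:
    """Parse tuned.conf-style INI text into {section: {key: value}},
    mirroring the shape of the module result seen in this trace."""
    parser = configparser.ConfigParser(interpolation=None)
    parser.read_string(text)
    return {s: dict(parser.items(s)) for s in parser.sections()}

# Hypothetical sample, echoing values from the result above.
sample = """\
[main]
summary = kernel settings

[sysctl]
fs.file-max = 400001
vm.max_map_count = 65530
"""
config = get_tuned_config(sample)
print(config["sysctl"]["fs.file-max"])  # -> 400001
```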
11277 1726776695.31241: attempt loop complete, returning result 11277 1726776695.31245: _execute() done 11277 1726776695.31249: dumping result to json 11277 1726776695.31253: done dumping result, returning 11277 1726776695.31261: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-cec2-986e-0000000009ff] 11277 1726776695.31267: sending task result for task 120fa90a-8a95-cec2-986e-0000000009ff 11277 1726776695.31300: done sending task result for task 120fa90a-8a95-cec2-986e-0000000009ff 11277 1726776695.31304: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8218 1726776695.31482: no more pending results, returning what we have 8218 1726776695.31485: results queue empty 8218 1726776695.31486: checking for any_errors_fatal 8218 1726776695.31493: done checking for any_errors_fatal 8218 1726776695.31493: checking for max_fail_percentage 8218 1726776695.31495: done checking for max_fail_percentage 8218 1726776695.31495: checking to see if all hosts have failed and the running result is not ok 8218 1726776695.31496: done checking to see if all hosts have failed 8218 1726776695.31497: getting the remaining hosts for this loop 8218 1726776695.31498: done getting the remaining hosts for this loop 8218 1726776695.31501: getting the next task for host managed_node2 8218 1726776695.31506: done getting next task for host managed_node2 8218 1726776695.31509: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8218 1726776695.31512: ^ state is: HOST STATE: block=2, task=45, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776695.31521: getting variables 8218 1726776695.31523: in VariableManager get_vars() 8218 1726776695.31557: Calling all_inventory to load vars for managed_node2 8218 1726776695.31559: Calling groups_inventory to load vars for managed_node2 8218 1726776695.31560: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776695.31569: Calling all_plugins_play to load vars for managed_node2 8218 1726776695.31571: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776695.31573: Calling groups_plugins_play to load vars for managed_node2 8218 1726776695.31723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776695.31844: done with get_vars() 8218 1726776695.31852: done getting variables 8218 1726776695.31897: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] *******
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Thursday 19 September 2024 16:11:35 -0400 (0:00:00.335) 0:01:21.149 ****

8218 1726776695.31920: entering _queue_task() for
managed_node2/template 8218 1726776695.32090: worker is 1 (out of 1 available) 8218 1726776695.32104: exiting _queue_task() for managed_node2/template 8218 1726776695.32117: done queuing things up, now waiting for results queue to drain 8218 1726776695.32119: waiting for pending results... 11285 1726776695.32249: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11285 1726776695.32364: in run() - task 120fa90a-8a95-cec2-986e-000000000a00 11285 1726776695.32382: variable 'ansible_search_path' from source: unknown 11285 1726776695.32386: variable 'ansible_search_path' from source: unknown 11285 1726776695.32413: calling self._execute() 11285 1726776695.32483: variable 'ansible_host' from source: host vars for 'managed_node2' 11285 1726776695.32492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11285 1726776695.32501: variable 'omit' from source: magic vars 11285 1726776695.32582: variable 'omit' from source: magic vars 11285 1726776695.32619: variable 'omit' from source: magic vars 11285 1726776695.32862: variable '__kernel_settings_profile_src' from source: role '' all vars 11285 1726776695.32871: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11285 1726776695.32930: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11285 1726776695.32951: variable '__kernel_settings_profile_filename' from source: role '' all vars 11285 1726776695.32998: variable '__kernel_settings_profile_filename' from source: role '' all vars 11285 1726776695.33049: variable '__kernel_settings_profile_dir' from source: role '' all vars 11285 1726776695.33109: variable '__kernel_settings_profile_parent' from source: set_fact 11285 1726776695.33118: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11285 1726776695.33145: variable 'omit' from source: magic vars 11285 1726776695.33179: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11285 1726776695.33205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11285 1726776695.33223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11285 1726776695.33243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11285 1726776695.33255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11285 1726776695.33278: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11285 1726776695.33283: variable 'ansible_host' from source: host vars for 'managed_node2' 11285 1726776695.33288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11285 1726776695.33356: Set connection var ansible_connection to ssh 11285 1726776695.33364: Set connection var ansible_pipelining to False 11285 1726776695.33371: Set connection var ansible_timeout to 10 11285 1726776695.33378: Set connection var ansible_module_compression to ZIP_DEFLATED 11285 1726776695.33383: Set connection var ansible_shell_type to sh 11285 1726776695.33388: Set connection var ansible_shell_executable to /bin/sh 11285 1726776695.33404: variable 'ansible_shell_executable' from source: unknown 11285 1726776695.33408: variable 'ansible_connection' from source: unknown 11285 1726776695.33412: variable 'ansible_module_compression' from source: unknown 11285 1726776695.33416: variable 'ansible_shell_type' from source: unknown 11285 1726776695.33419: variable 'ansible_shell_executable' from source: unknown 11285 1726776695.33424: variable 'ansible_host' from source: host vars for 'managed_node2' 11285 1726776695.33428: variable 'ansible_pipelining' from source: unknown 11285 1726776695.33433: variable 'ansible_timeout' from 
source: unknown 11285 1726776695.33436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11285 1726776695.33525: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11285 1726776695.33539: variable 'omit' from source: magic vars 11285 1726776695.33546: starting attempt loop 11285 1726776695.33550: running the handler 11285 1726776695.33560: _low_level_execute_command(): starting 11285 1726776695.33568: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11285 1726776695.35919: stdout chunk (state=2): >>>/root <<< 11285 1726776695.36046: stderr chunk (state=3): >>><<< 11285 1726776695.36053: stdout chunk (state=3): >>><<< 11285 1726776695.36071: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11285 1726776695.36084: _low_level_execute_command(): starting 11285 1726776695.36090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201 `" && echo ansible-tmp-1726776695.360791-11285-262270139549201="` echo /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201 `" ) && sleep 0' 11285 1726776695.39016: stdout chunk (state=2): >>>ansible-tmp-1726776695.360791-11285-262270139549201=/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201 <<< 11285 1726776695.39147: stderr chunk (state=3): >>><<< 11285 1726776695.39154: stdout chunk (state=3): >>><<< 11285 1726776695.39170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776695.360791-11285-262270139549201=/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201 , stderr= 11285 1726776695.39186: evaluation_path: 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11285 1726776695.39204: search_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11285 1726776695.39227: variable 'ansible_search_path' from source: unknown 11285 1726776695.39806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11285 1726776695.41232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11285 1726776695.41286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11285 1726776695.41314: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11285 1726776695.41341: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11285 1726776695.41361: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11285 1726776695.41541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11285 1726776695.41561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11285 1726776695.41584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11285 1726776695.41612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11285 1726776695.41623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11285 1726776695.41841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11285 1726776695.41859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11285 1726776695.41877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11285 1726776695.41902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11285 1726776695.41913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11285 1726776695.42156: variable 'ansible_managed' from source: unknown 11285 1726776695.42164: variable '__sections' from source: task vars 11285 1726776695.42246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11285 1726776695.42263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11285 1726776695.42282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11285 1726776695.42307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11285 1726776695.42317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11285 1726776695.42388: variable 'kernel_settings_sysctl' from source: include params 11285 1726776695.42397: variable '__kernel_settings_state_empty' from source: role '' all vars 11285 1726776695.42403: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11285 1726776695.42430: variable '__sysctl_old' from source: task vars 11285 1726776695.42474: variable '__sysctl_old' from source: task vars 11285 1726776695.42608: variable 'kernel_settings_purge' from source: role '' defaults 11285 1726776695.42615: variable 'kernel_settings_sysctl' from source: include params 11285 
1726776695.42622: variable '__kernel_settings_state_empty' from source: role '' all vars 11285 1726776695.42627: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11285 1726776695.42634: variable '__kernel_settings_profile_contents' from source: set_fact 11285 1726776695.42758: variable 'kernel_settings_sysfs' from source: include params 11285 1726776695.42765: variable '__kernel_settings_state_empty' from source: role '' all vars 11285 1726776695.42771: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11285 1726776695.42782: variable '__sysfs_old' from source: task vars 11285 1726776695.42823: variable '__sysfs_old' from source: task vars 11285 1726776695.42955: variable 'kernel_settings_purge' from source: role '' defaults 11285 1726776695.42962: variable 'kernel_settings_sysfs' from source: include params 11285 1726776695.42968: variable '__kernel_settings_state_empty' from source: role '' all vars 11285 1726776695.42973: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11285 1726776695.42978: variable '__kernel_settings_profile_contents' from source: set_fact 11285 1726776695.42994: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 11285 1726776695.43002: variable '__systemd_old' from source: task vars 11285 1726776695.43045: variable '__systemd_old' from source: task vars 11285 1726776695.43170: variable 'kernel_settings_purge' from source: role '' defaults 11285 1726776695.43177: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 11285 1726776695.43182: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.43188: variable '__kernel_settings_profile_contents' from source: set_fact 11285 1726776695.43202: variable 'kernel_settings_transparent_hugepages' from source: include params 11285 1726776695.43249: variable 'kernel_settings_transparent_hugepages' from source: include params 11285 
1726776695.43258: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 11285 1726776695.43263: variable '__trans_huge_old' from source: task vars 11285 1726776695.43303: variable '__trans_huge_old' from source: task vars 11285 1726776695.43427: variable 'kernel_settings_purge' from source: role '' defaults 11285 1726776695.43435: variable 'kernel_settings_transparent_hugepages' from source: include params 11285 1726776695.43440: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.43446: variable '__kernel_settings_profile_contents' from source: set_fact 11285 1726776695.43457: variable '__trans_defrag_old' from source: task vars 11285 1726776695.43497: variable '__trans_defrag_old' from source: task vars 11285 1726776695.43620: variable 'kernel_settings_purge' from source: role '' defaults 11285 1726776695.43627: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 11285 1726776695.43634: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.43639: variable '__kernel_settings_profile_contents' from source: set_fact 11285 1726776695.43655: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.43666: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.43675: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.43688: variable '__kernel_settings_state_absent' from source: role '' all vars 11285 1726776695.44117: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11285 1726776695.44160: variable 'ansible_module_compression' from source: unknown 11285 1726776695.44199: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11285 1726776695.44217: variable 'ansible_facts' from source: unknown 11285 1726776695.44284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_stat.py 11285 1726776695.44369: Sending initial data 11285 1726776695.44375: Sent initial data (151 bytes) 11285 1726776695.46967: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp6ihce87h /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_stat.py <<< 11285 1726776695.48022: stderr chunk (state=3): >>><<< 11285 1726776695.48030: stdout chunk (state=3): >>><<< 11285 1726776695.48047: done transferring module to remote 11285 1726776695.48057: _low_level_execute_command(): starting 11285 1726776695.48062: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/ /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_stat.py && sleep 0' 11285 1726776695.50418: stderr chunk (state=2): >>><<< 11285 1726776695.50425: stdout chunk (state=2): >>><<< 11285 1726776695.50439: _low_level_execute_command() done: rc=0, stdout=, stderr= 11285 1726776695.50443: _low_level_execute_command(): starting 11285 1726776695.50448: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_stat.py && sleep 0' 11285 1726776695.67347: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 155189476, "dev": 51713, "nlink": 1, "atime": 1726776686.0715325, "mtime": 
1726776685.3125296, "ctime": 1726776685.5725305, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "mimetype": "text/plain", "charset": "us-ascii", "version": "1371114626", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11285 1726776695.68507: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11285 1726776695.68556: stderr chunk (state=3): >>><<< 11285 1726776695.68562: stdout chunk (state=3): >>><<< 11285 1726776695.68582: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 155189476, "dev": 51713, "nlink": 1, "atime": 1726776686.0715325, "mtime": 1726776685.3125296, "ctime": 1726776685.5725305, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "mimetype": "text/plain", "charset": "us-ascii", "version": "1371114626", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, 
"checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 11285 1726776695.68619: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11285 1726776695.68706: Sending initial data 11285 1726776695.68714: Sent initial data (159 bytes) 11285 1726776695.71248: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp8z4kaqp6/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source <<< 11285 1726776695.71585: stderr chunk (state=3): >>><<< 11285 1726776695.71592: stdout chunk (state=3): >>><<< 11285 1726776695.71606: _low_level_execute_command(): starting 11285 1726776695.71611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/ /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source && sleep 0' 11285 1726776695.73889: stderr chunk (state=2): >>><<< 11285 1726776695.73895: stdout chunk (state=2): >>><<< 11285 1726776695.73908: _low_level_execute_command() done: rc=0, stdout=, stderr= 11285 1726776695.73928: variable 'ansible_module_compression' from source: unknown 11285 1726776695.73962: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11285 1726776695.73982: variable 'ansible_facts' from source: unknown 11285 1726776695.74040: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_copy.py 11285 1726776695.74123: Sending initial data 11285 1726776695.74132: Sent initial data (151 bytes) 11285 1726776695.76576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp57vtmn4k /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_copy.py <<< 11285 1726776695.77656: stderr chunk (state=3): >>><<< 11285 1726776695.77662: stdout chunk (state=3): >>><<< 11285 1726776695.77680: done transferring module to remote 11285 1726776695.77689: _low_level_execute_command(): starting 11285 1726776695.77694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/ /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_copy.py && sleep 0' 11285 1726776695.80023: stderr chunk (state=2): >>><<< 11285 1726776695.80030: stdout chunk (state=2): >>><<< 11285 1726776695.80043: _low_level_execute_command() done: rc=0, stdout=, stderr= 11285 1726776695.80047: _low_level_execute_command(): starting 11285 1726776695.80052: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/AnsiballZ_copy.py && sleep 0' 11285 1726776695.96805: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source", "md5sum": "394928e588644c456053f3dec5f7c2ba", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": 
"system_u:object_r:tuned_etc_t:s0", "size": 121, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11285 1726776695.97958: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11285 1726776695.98009: stderr chunk (state=3): >>><<< 11285 1726776695.98016: stdout chunk (state=3): >>><<< 11285 1726776695.98034: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source", "md5sum": "394928e588644c456053f3dec5f7c2ba", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11285 1726776695.98059: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '0b586509c0bdce12a2dde058e3374dab88cf7f2c', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11285 1726776695.98091: _low_level_execute_command(): starting 11285 1726776695.98098: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/ > /dev/null 2>&1 && sleep 0' 11285 1726776696.00535: stderr chunk (state=2): >>><<< 11285 1726776696.00542: stdout chunk (state=2): >>><<< 11285 1726776696.00555: _low_level_execute_command() done: rc=0, stdout=, stderr= 11285 1726776696.00565: handler run complete 11285 1726776696.00585: attempt loop complete, returning result 11285 1726776696.00588: _execute() done 11285 1726776696.00591: dumping result to json 11285 1726776696.00597: done dumping result, returning 11285 1726776696.00605: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-cec2-986e-000000000a00] 11285 1726776696.00611: sending task result for task 120fa90a-8a95-cec2-986e-000000000a00 11285 1726776696.00656: done sending task result for task 
120fa90a-8a95-cec2-986e-000000000a00 11285 1726776696.00661: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "394928e588644c456053f3dec5f7c2ba", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "src": "/root/.ansible/tmp/ansible-tmp-1726776695.360791-11285-262270139549201/source", "state": "file", "uid": 0 } 8218 1726776696.00847: no more pending results, returning what we have 8218 1726776696.00851: results queue empty 8218 1726776696.00851: checking for any_errors_fatal 8218 1726776696.00857: done checking for any_errors_fatal 8218 1726776696.00858: checking for max_fail_percentage 8218 1726776696.00860: done checking for max_fail_percentage 8218 1726776696.00860: checking to see if all hosts have failed and the running result is not ok 8218 1726776696.00861: done checking to see if all hosts have failed 8218 1726776696.00862: getting the remaining hosts for this loop 8218 1726776696.00863: done getting the remaining hosts for this loop 8218 1726776696.00868: getting the next task for host managed_node2 8218 1726776696.00873: done getting next task for host managed_node2 8218 1726776696.00876: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8218 1726776696.00879: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8218 1726776696.00888: getting variables 8218 1726776696.00889: in VariableManager get_vars() 8218 1726776696.00921: Calling all_inventory to load vars for managed_node2 8218 1726776696.00923: Calling groups_inventory to load vars for managed_node2 8218 1726776696.00925: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776696.00934: Calling all_plugins_play to load vars for managed_node2 8218 1726776696.00937: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776696.00939: Calling groups_plugins_play to load vars for managed_node2 8218 1726776696.01048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776696.01170: done with get_vars() 8218 1726776696.01178: done getting variables 8218 1726776696.01221: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:11:36 -0400 (0:00:00.693) 0:01:21.843 **** 8218 1726776696.01247: entering _queue_task() for managed_node2/service 8218 1726776696.01406: worker is 1 (out of 1 available) 8218 1726776696.01421: exiting _queue_task() for managed_node2/service 8218 1726776696.01436: done queuing things up, now waiting for results queue to drain 8218 1726776696.01437: waiting for pending results... 
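The two tasks traced in this part of the log ("Restart tuned to apply active profile, mode changes" at main.yml:149 and "Tuned apply settings" at main.yml:157) are gated on the registered change flags that appear in the `Evaluated conditional` lines below. A minimal sketch of that gating logic (a hypothetical helper, not the role's actual code):

```python
def tuned_action(profile_changed: bool, mode_changed: bool,
                 apply_changed: bool) -> str:
    """Sketch of the conditionals the log shows for the kernel_settings role.

    - If the active-profile file or the daemon mode changed, tuned must be
      restarted ("Restart tuned ..." task).
    - Otherwise, if only the rendered settings changed, `tuned-adm profile`
      is invoked ("Tuned apply settings" task).
    - If nothing changed, both tasks are skipped.
    """
    if profile_changed or mode_changed:
        return "restart"
    if apply_changed:
        return "apply"
    return "skip"

# In this run the log evaluates: profile changed -> False, mode changed ->
# False, apply changed -> True, so the restart task is skipped and
# `tuned-adm profile 'virtual-guest kernel_settings'` is executed.
print(tuned_action(False, False, True))
```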
11303 1726776696.01562: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11303 1726776696.01675: in run() - task 120fa90a-8a95-cec2-986e-000000000a01 11303 1726776696.01691: variable 'ansible_search_path' from source: unknown 11303 1726776696.01695: variable 'ansible_search_path' from source: unknown 11303 1726776696.01731: variable '__kernel_settings_services' from source: include_vars 11303 1726776696.02040: variable '__kernel_settings_services' from source: include_vars 11303 1726776696.02094: variable 'omit' from source: magic vars 11303 1726776696.02163: variable 'ansible_host' from source: host vars for 'managed_node2' 11303 1726776696.02174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11303 1726776696.02183: variable 'omit' from source: magic vars 11303 1726776696.02361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11303 1726776696.02525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11303 1726776696.02561: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11303 1726776696.02585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11303 1726776696.02606: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11303 1726776696.02685: variable '__kernel_settings_register_profile' from source: set_fact 11303 1726776696.02698: variable '__kernel_settings_register_mode' from source: set_fact 11303 1726776696.02715: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 11303 1726776696.02719: when evaluation is False, skipping this task 11303 1726776696.02741: variable 'item' from source: unknown 11303 
1726776696.02787: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 11303 1726776696.02815: dumping result to json 11303 1726776696.02820: done dumping result, returning 11303 1726776696.02826: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-cec2-986e-000000000a01] 11303 1726776696.02835: sending task result for task 120fa90a-8a95-cec2-986e-000000000a01 11303 1726776696.02857: done sending task result for task 120fa90a-8a95-cec2-986e-000000000a01 11303 1726776696.02860: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8218 1726776696.03014: no more pending results, returning what we have 8218 1726776696.03017: results queue empty 8218 1726776696.03017: checking for any_errors_fatal 8218 1726776696.03027: done checking for any_errors_fatal 8218 1726776696.03030: checking for max_fail_percentage 8218 1726776696.03031: done checking for max_fail_percentage 8218 1726776696.03032: checking to see if all hosts have failed and the running result is not ok 8218 1726776696.03033: done checking to see if all hosts have failed 8218 1726776696.03033: getting the remaining hosts for this loop 8218 1726776696.03034: done getting the remaining hosts for this loop 8218 1726776696.03037: getting the next task for host managed_node2 8218 1726776696.03043: done getting next task for host managed_node2 8218 1726776696.03046: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8218 1726776696.03048: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776696.03059: getting variables 8218 1726776696.03060: in VariableManager get_vars() 8218 1726776696.03084: Calling all_inventory to load vars for managed_node2 8218 1726776696.03086: Calling groups_inventory to load vars for managed_node2 8218 1726776696.03087: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776696.03093: Calling all_plugins_play to load vars for managed_node2 8218 1726776696.03095: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776696.03096: Calling groups_plugins_play to load vars for managed_node2 8218 1726776696.03198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776696.03313: done with get_vars() 8218 1726776696.03321: done getting variables 8218 1726776696.03362: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:11:36 -0400 (0:00:00.021) 0:01:21.864 **** 8218 1726776696.03386: entering _queue_task() for managed_node2/command 8218 1726776696.03534: worker is 1 (out of 1 available) 8218 
1726776696.03548: exiting _queue_task() for managed_node2/command 8218 1726776696.03559: done queuing things up, now waiting for results queue to drain 8218 1726776696.03560: waiting for pending results... 11304 1726776696.03684: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11304 1726776696.03796: in run() - task 120fa90a-8a95-cec2-986e-000000000a02 11304 1726776696.03811: variable 'ansible_search_path' from source: unknown 11304 1726776696.03815: variable 'ansible_search_path' from source: unknown 11304 1726776696.03843: calling self._execute() 11304 1726776696.03904: variable 'ansible_host' from source: host vars for 'managed_node2' 11304 1726776696.03913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11304 1726776696.03922: variable 'omit' from source: magic vars 11304 1726776696.04233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11304 1726776696.04500: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11304 1726776696.04533: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11304 1726776696.04559: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11304 1726776696.04585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11304 1726776696.04666: variable '__kernel_settings_register_profile' from source: set_fact 11304 1726776696.04688: Evaluated conditional (not __kernel_settings_register_profile is changed): True 11304 1726776696.04776: variable '__kernel_settings_register_mode' from source: set_fact 11304 1726776696.04788: Evaluated conditional (not __kernel_settings_register_mode is changed): True 11304 1726776696.04862: variable '__kernel_settings_register_apply' from source: set_fact 11304 
1726776696.04872: Evaluated conditional (__kernel_settings_register_apply is changed): True 11304 1726776696.04878: variable 'omit' from source: magic vars 11304 1726776696.04904: variable 'omit' from source: magic vars 11304 1726776696.04991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11304 1726776696.06375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11304 1726776696.06428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11304 1726776696.06460: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11304 1726776696.06486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11304 1726776696.06508: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11304 1726776696.06561: variable '__kernel_settings_active_profile' from source: set_fact 11304 1726776696.06590: variable 'omit' from source: magic vars 11304 1726776696.06613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11304 1726776696.06637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11304 1726776696.06652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11304 1726776696.06665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11304 1726776696.06676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11304 1726776696.06698: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11304 1726776696.06703: variable 'ansible_host' 
from source: host vars for 'managed_node2' 11304 1726776696.06708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11304 1726776696.06773: Set connection var ansible_connection to ssh 11304 1726776696.06782: Set connection var ansible_pipelining to False 11304 1726776696.06788: Set connection var ansible_timeout to 10 11304 1726776696.06795: Set connection var ansible_module_compression to ZIP_DEFLATED 11304 1726776696.06800: Set connection var ansible_shell_type to sh 11304 1726776696.06806: Set connection var ansible_shell_executable to /bin/sh 11304 1726776696.06822: variable 'ansible_shell_executable' from source: unknown 11304 1726776696.06827: variable 'ansible_connection' from source: unknown 11304 1726776696.06833: variable 'ansible_module_compression' from source: unknown 11304 1726776696.06836: variable 'ansible_shell_type' from source: unknown 11304 1726776696.06839: variable 'ansible_shell_executable' from source: unknown 11304 1726776696.06843: variable 'ansible_host' from source: host vars for 'managed_node2' 11304 1726776696.06847: variable 'ansible_pipelining' from source: unknown 11304 1726776696.06850: variable 'ansible_timeout' from source: unknown 11304 1726776696.06855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11304 1726776696.06919: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11304 1726776696.06933: variable 'omit' from source: magic vars 11304 1726776696.06939: starting attempt loop 11304 1726776696.06942: running the handler 11304 1726776696.06954: _low_level_execute_command(): starting 11304 1726776696.06961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11304 1726776696.09269: stdout chunk 
(state=2): >>>/root <<< 11304 1726776696.09391: stderr chunk (state=3): >>><<< 11304 1726776696.09399: stdout chunk (state=3): >>><<< 11304 1726776696.09417: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11304 1726776696.09430: _low_level_execute_command(): starting 11304 1726776696.09436: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294 `" && echo ansible-tmp-1726776696.0942483-11304-188548534562294="` echo /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294 `" ) && sleep 0' 11304 1726776696.11985: stdout chunk (state=2): >>>ansible-tmp-1726776696.0942483-11304-188548534562294=/root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294 <<< 11304 1726776696.12113: stderr chunk (state=3): >>><<< 11304 1726776696.12119: stdout chunk (state=3): >>><<< 11304 1726776696.12135: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776696.0942483-11304-188548534562294=/root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294 , stderr= 11304 1726776696.12158: variable 'ansible_module_compression' from source: unknown 11304 1726776696.12195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11304 1726776696.12223: variable 'ansible_facts' from source: unknown 11304 1726776696.12295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/AnsiballZ_command.py 11304 1726776696.12391: Sending initial data 11304 1726776696.12398: Sent initial data (155 bytes) 11304 1726776696.14889: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpizme27vb /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/AnsiballZ_command.py <<< 11304 1726776696.15950: stderr chunk (state=3): >>><<< 11304 
1726776696.15956: stdout chunk (state=3): >>><<< 11304 1726776696.15975: done transferring module to remote 11304 1726776696.15986: _low_level_execute_command(): starting 11304 1726776696.15992: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/ /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/AnsiballZ_command.py && sleep 0' 11304 1726776696.18298: stderr chunk (state=2): >>><<< 11304 1726776696.18305: stdout chunk (state=2): >>><<< 11304 1726776696.18317: _low_level_execute_command() done: rc=0, stdout=, stderr= 11304 1726776696.18321: _low_level_execute_command(): starting 11304 1726776696.18326: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/AnsiballZ_command.py && sleep 0' 11304 1726776697.49943: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:36.331268", "end": "2024-09-19 16:11:37.497552", "delta": "0:00:01.166284", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11304 1726776697.51172: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11304 1726776697.51221: stderr chunk (state=3): >>><<< 11304 1726776697.51230: stdout chunk (state=3): >>><<< 11304 1726776697.51247: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:36.331268", "end": "2024-09-19 16:11:37.497552", "delta": "0:00:01.166284", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11304 1726776697.51276: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11304 1726776697.51286: _low_level_execute_command(): starting 11304 1726776697.51292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776696.0942483-11304-188548534562294/ > /dev/null 2>&1 && sleep 0' 11304 1726776697.53721: stderr chunk (state=2): >>><<< 11304 1726776697.53730: stdout chunk (state=2): >>><<< 11304 1726776697.53745: _low_level_execute_command() done: rc=0, stdout=, stderr= 11304 
1726776697.53752: handler run complete 11304 1726776697.53772: Evaluated conditional (True): True 11304 1726776697.53782: attempt loop complete, returning result 11304 1726776697.53786: _execute() done 11304 1726776697.53789: dumping result to json 11304 1726776697.53795: done dumping result, returning 11304 1726776697.53802: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-cec2-986e-000000000a02] 11304 1726776697.53808: sending task result for task 120fa90a-8a95-cec2-986e-000000000a02 11304 1726776697.53841: done sending task result for task 120fa90a-8a95-cec2-986e-000000000a02 11304 1726776697.53845: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.166284", "end": "2024-09-19 16:11:37.497552", "rc": 0, "start": "2024-09-19 16:11:36.331268" } 8218 1726776697.54008: no more pending results, returning what we have 8218 1726776697.54011: results queue empty 8218 1726776697.54012: checking for any_errors_fatal 8218 1726776697.54020: done checking for any_errors_fatal 8218 1726776697.54021: checking for max_fail_percentage 8218 1726776697.54022: done checking for max_fail_percentage 8218 1726776697.54023: checking to see if all hosts have failed and the running result is not ok 8218 1726776697.54024: done checking to see if all hosts have failed 8218 1726776697.54024: getting the remaining hosts for this loop 8218 1726776697.54026: done getting the remaining hosts for this loop 8218 1726776697.54031: getting the next task for host managed_node2 8218 1726776697.54037: done getting next task for host managed_node2 8218 1726776697.54041: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8218 1726776697.54043: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776697.54053: getting variables 8218 1726776697.54054: in VariableManager get_vars() 8218 1726776697.54087: Calling all_inventory to load vars for managed_node2 8218 1726776697.54090: Calling groups_inventory to load vars for managed_node2 8218 1726776697.54092: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776697.54100: Calling all_plugins_play to load vars for managed_node2 8218 1726776697.54102: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776697.54104: Calling groups_plugins_play to load vars for managed_node2 8218 1726776697.54286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776697.54404: done with get_vars() 8218 1726776697.54412: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:11:37 -0400 (0:00:01.510) 0:01:23.375 **** 8218 1726776697.54484: entering _queue_task() for managed_node2/include_tasks 8218 1726776697.54644: worker is 1 (out of 1 available) 8218 1726776697.54659: exiting _queue_task() for managed_node2/include_tasks 8218 1726776697.54672: done queuing things up, now waiting for results queue to drain 8218 1726776697.54673: waiting for pending results... 
11315 1726776697.54807: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11315 1726776697.54924: in run() - task 120fa90a-8a95-cec2-986e-000000000a03 11315 1726776697.54943: variable 'ansible_search_path' from source: unknown 11315 1726776697.54947: variable 'ansible_search_path' from source: unknown 11315 1726776697.54975: calling self._execute() 11315 1726776697.55044: variable 'ansible_host' from source: host vars for 'managed_node2' 11315 1726776697.55053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11315 1726776697.55065: variable 'omit' from source: magic vars 11315 1726776697.55388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11315 1726776697.55578: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11315 1726776697.55611: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11315 1726776697.55640: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11315 1726776697.55670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11315 1726776697.55751: variable '__kernel_settings_register_apply' from source: set_fact 11315 1726776697.55777: Evaluated conditional (__kernel_settings_register_apply is changed): True 11315 1726776697.55784: _execute() done 11315 1726776697.55788: dumping result to json 11315 1726776697.55792: done dumping result, returning 11315 1726776697.55798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-cec2-986e-000000000a03] 11315 1726776697.55805: sending task result for task 120fa90a-8a95-cec2-986e-000000000a03 11315 1726776697.55830: done sending task result for task 120fa90a-8a95-cec2-986e-000000000a03 11315 
1726776697.55834: WORKER PROCESS EXITING 8218 1726776697.55933: no more pending results, returning what we have 8218 1726776697.55937: in VariableManager get_vars() 8218 1726776697.55975: Calling all_inventory to load vars for managed_node2 8218 1726776697.55978: Calling groups_inventory to load vars for managed_node2 8218 1726776697.55980: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776697.55988: Calling all_plugins_play to load vars for managed_node2 8218 1726776697.55991: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776697.55993: Calling groups_plugins_play to load vars for managed_node2 8218 1726776697.56110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776697.56224: done with get_vars() 8218 1726776697.56231: variable 'ansible_search_path' from source: unknown 8218 1726776697.56232: variable 'ansible_search_path' from source: unknown 8218 1726776697.56255: we have included files to process 8218 1726776697.56255: generating all_blocks data 8218 1726776697.56259: done generating all_blocks data 8218 1726776697.56263: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8218 1726776697.56263: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8218 1726776697.56265: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8218 1726776697.56517: done processing included file 8218 1726776697.56520: iterating over new_blocks loaded from include file 8218 1726776697.56520: in VariableManager get_vars() 8218 1726776697.56538: done with get_vars() 8218 
1726776697.56539: filtering new block on tags 8218 1726776697.56573: done filtering new block on tags 8218 1726776697.56574: done iterating over new_blocks loaded from include file 8218 1726776697.56575: extending task lists for all hosts with included blocks 8218 1726776697.56968: done extending task lists 8218 1726776697.56970: done processing included files 8218 1726776697.56970: results queue empty 8218 1726776697.56970: checking for any_errors_fatal 8218 1726776697.56973: done checking for any_errors_fatal 8218 1726776697.56974: checking for max_fail_percentage 8218 1726776697.56974: done checking for max_fail_percentage 8218 1726776697.56975: checking to see if all hosts have failed and the running result is not ok 8218 1726776697.56975: done checking to see if all hosts have failed 8218 1726776697.56976: getting the remaining hosts for this loop 8218 1726776697.56976: done getting the remaining hosts for this loop 8218 1726776697.56978: getting the next task for host managed_node2 8218 1726776697.56981: done getting next task for host managed_node2 8218 1726776697.56982: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8218 1726776697.56984: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776697.56990: getting variables 8218 1726776697.56991: in VariableManager get_vars() 8218 1726776697.57000: Calling all_inventory to load vars for managed_node2 8218 1726776697.57001: Calling groups_inventory to load vars for managed_node2 8218 1726776697.57002: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776697.57005: Calling all_plugins_play to load vars for managed_node2 8218 1726776697.57006: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776697.57008: Calling groups_plugins_play to load vars for managed_node2 8218 1726776697.57085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776697.57195: done with get_vars() 8218 1726776697.57201: done getting variables 8218 1726776697.57226: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 16:11:37 -0400 (0:00:00.027) 0:01:23.403 **** 8218 1726776697.57249: entering _queue_task() for managed_node2/command 8218 1726776697.57401: worker is 1 (out of 1 available) 8218 1726776697.57415: exiting _queue_task() for managed_node2/command 8218 1726776697.57427: done queuing things up, now waiting for results queue to drain 8218 1726776697.57430: waiting for pending results... 
11316 1726776697.57555: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11316 1726776697.57677: in run() - task 120fa90a-8a95-cec2-986e-000000000c42 11316 1726776697.57692: variable 'ansible_search_path' from source: unknown 11316 1726776697.57696: variable 'ansible_search_path' from source: unknown 11316 1726776697.57722: calling self._execute() 11316 1726776697.57788: variable 'ansible_host' from source: host vars for 'managed_node2' 11316 1726776697.57798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11316 1726776697.57806: variable 'omit' from source: magic vars 11316 1726776697.57888: variable 'omit' from source: magic vars 11316 1726776697.57932: variable 'omit' from source: magic vars 11316 1726776697.57955: variable 'omit' from source: magic vars 11316 1726776697.57988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11316 1726776697.58013: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11316 1726776697.58035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11316 1726776697.58049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11316 1726776697.58060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11316 1726776697.58083: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11316 1726776697.58089: variable 'ansible_host' from source: host vars for 'managed_node2' 11316 1726776697.58094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11316 1726776697.58162: Set connection var ansible_connection to ssh 11316 1726776697.58171: Set connection var ansible_pipelining to False 11316 
1726776697.58177: Set connection var ansible_timeout to 10 11316 1726776697.58185: Set connection var ansible_module_compression to ZIP_DEFLATED 11316 1726776697.58190: Set connection var ansible_shell_type to sh 11316 1726776697.58195: Set connection var ansible_shell_executable to /bin/sh 11316 1726776697.58211: variable 'ansible_shell_executable' from source: unknown 11316 1726776697.58215: variable 'ansible_connection' from source: unknown 11316 1726776697.58219: variable 'ansible_module_compression' from source: unknown 11316 1726776697.58223: variable 'ansible_shell_type' from source: unknown 11316 1726776697.58226: variable 'ansible_shell_executable' from source: unknown 11316 1726776697.58231: variable 'ansible_host' from source: host vars for 'managed_node2' 11316 1726776697.58236: variable 'ansible_pipelining' from source: unknown 11316 1726776697.58239: variable 'ansible_timeout' from source: unknown 11316 1726776697.58243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11316 1726776697.58335: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11316 1726776697.58346: variable 'omit' from source: magic vars 11316 1726776697.58352: starting attempt loop 11316 1726776697.58355: running the handler 11316 1726776697.58369: _low_level_execute_command(): starting 11316 1726776697.58377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11316 1726776697.60718: stdout chunk (state=2): >>>/root <<< 11316 1726776697.60839: stderr chunk (state=3): >>><<< 11316 1726776697.60845: stdout chunk (state=3): >>><<< 11316 1726776697.60867: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11316 1726776697.60880: _low_level_execute_command(): starting 11316 
1726776697.60886: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586 `" && echo ansible-tmp-1726776697.608754-11316-26916899419586="` echo /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586 `" ) && sleep 0' 11316 1726776697.63571: stdout chunk (state=2): >>>ansible-tmp-1726776697.608754-11316-26916899419586=/root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586 <<< 11316 1726776697.63697: stderr chunk (state=3): >>><<< 11316 1726776697.63704: stdout chunk (state=3): >>><<< 11316 1726776697.63717: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776697.608754-11316-26916899419586=/root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586 , stderr= 11316 1726776697.63742: variable 'ansible_module_compression' from source: unknown 11316 1726776697.63788: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11316 1726776697.63818: variable 'ansible_facts' from source: unknown 11316 1726776697.63895: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/AnsiballZ_command.py 11316 1726776697.63993: Sending initial data 11316 1726776697.64000: Sent initial data (153 bytes) 11316 1726776697.66470: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpruvbhfbu /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/AnsiballZ_command.py <<< 11316 1726776697.67513: stderr chunk (state=3): >>><<< 11316 1726776697.67520: stdout chunk (state=3): >>><<< 11316 1726776697.67539: done transferring module to remote 11316 1726776697.67549: _low_level_execute_command(): starting 11316 1726776697.67554: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/ /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/AnsiballZ_command.py && sleep 0' 11316 1726776697.69907: stderr chunk (state=2): >>><<< 11316 1726776697.69913: stdout chunk (state=2): >>><<< 11316 1726776697.69925: _low_level_execute_command() done: rc=0, stdout=, stderr= 11316 1726776697.69930: _low_level_execute_command(): starting 11316 1726776697.69935: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/AnsiballZ_command.py && sleep 0' 11316 1726776697.95363: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:37.848616", "end": "2024-09-19 16:11:37.951788", "delta": "0:00:00.103172", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11316 1726776697.96569: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11316 1726776697.96614: stderr chunk (state=3): >>><<< 11316 1726776697.96621: stdout chunk (state=3): >>><<< 11316 1726776697.96640: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:37.848616", "end": "2024-09-19 16:11:37.951788", "delta": "0:00:00.103172", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11316 1726776697.96684: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11316 1726776697.96694: _low_level_execute_command(): starting 11316 1726776697.96701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776697.608754-11316-26916899419586/ > /dev/null 2>&1 && sleep 0' 11316 1726776697.99089: stderr chunk (state=2): >>><<< 11316 1726776697.99098: stdout chunk (state=2): >>><<< 11316 1726776697.99112: _low_level_execute_command() done: 
rc=0, stdout=, stderr= 11316 1726776697.99119: handler run complete 11316 1726776697.99140: Evaluated conditional (False): False 11316 1726776697.99150: attempt loop complete, returning result 11316 1726776697.99153: _execute() done 11316 1726776697.99157: dumping result to json 11316 1726776697.99163: done dumping result, returning 11316 1726776697.99172: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-cec2-986e-000000000c42] 11316 1726776697.99179: sending task result for task 120fa90a-8a95-cec2-986e-000000000c42 11316 1726776697.99211: done sending task result for task 120fa90a-8a95-cec2-986e-000000000c42 11316 1726776697.99215: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.103172", "end": "2024-09-19 16:11:37.951788", "rc": 0, "start": "2024-09-19 16:11:37.848616" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
8218 1726776697.99389: no more pending results, returning what we have 8218 1726776697.99392: results queue empty 8218 1726776697.99393: checking for any_errors_fatal 8218 1726776697.99394: done checking for any_errors_fatal 8218 1726776697.99395: checking for max_fail_percentage 8218 1726776697.99396: done checking for max_fail_percentage 8218 1726776697.99397: checking to see if all hosts have failed and the running result is not ok 8218 1726776697.99398: done checking to see if all hosts have failed 8218 1726776697.99398: getting the remaining hosts for this loop 8218 1726776697.99399: done getting the remaining hosts for this loop 8218 1726776697.99402: getting the next task for host managed_node2 8218 1726776697.99408: done getting next task for host managed_node2 8218 1726776697.99411: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8218 1726776697.99415: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776697.99426: getting variables 8218 1726776697.99427: in VariableManager get_vars() 8218 1726776697.99461: Calling all_inventory to load vars for managed_node2 8218 1726776697.99463: Calling groups_inventory to load vars for managed_node2 8218 1726776697.99465: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776697.99472: Calling all_plugins_play to load vars for managed_node2 8218 1726776697.99473: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776697.99475: Calling groups_plugins_play to load vars for managed_node2 8218 1726776697.99582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776697.99730: done with get_vars() 8218 1726776697.99739: done getting variables 8218 1726776697.99781: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 16:11:37 -0400 (0:00:00.425) 0:01:23.828 **** 8218 1726776697.99805: entering _queue_task() for managed_node2/shell 8218 1726776697.99966: worker is 1 (out of 1 available) 8218 1726776697.99981: exiting _queue_task() for managed_node2/shell 8218 1726776697.99992: done queuing things up, now waiting for results queue to drain 8218 1726776697.99993: waiting for pending results... 
11327 1726776698.00125: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11327 1726776698.00252: in run() - task 120fa90a-8a95-cec2-986e-000000000c43 11327 1726776698.00270: variable 'ansible_search_path' from source: unknown 11327 1726776698.00274: variable 'ansible_search_path' from source: unknown 11327 1726776698.00301: calling self._execute() 11327 1726776698.00368: variable 'ansible_host' from source: host vars for 'managed_node2' 11327 1726776698.00377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11327 1726776698.00386: variable 'omit' from source: magic vars 11327 1726776698.00707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11327 1726776698.00891: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11327 1726776698.00925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11327 1726776698.00951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11327 1726776698.00981: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11327 1726776698.01060: variable '__kernel_settings_register_verify_values' from source: set_fact 11327 1726776698.01085: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11327 1726776698.01091: when evaluation is False, skipping this task 11327 1726776698.01095: _execute() done 11327 1726776698.01099: dumping result to json 11327 1726776698.01103: done dumping result, returning 11327 1726776698.01109: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-cec2-986e-000000000c43] 11327 1726776698.01115: sending task result for task 
120fa90a-8a95-cec2-986e-000000000c43 11327 1726776698.01138: done sending task result for task 120fa90a-8a95-cec2-986e-000000000c43 11327 1726776698.01142: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8218 1726776698.01242: no more pending results, returning what we have 8218 1726776698.01245: results queue empty 8218 1726776698.01246: checking for any_errors_fatal 8218 1726776698.01254: done checking for any_errors_fatal 8218 1726776698.01254: checking for max_fail_percentage 8218 1726776698.01256: done checking for max_fail_percentage 8218 1726776698.01256: checking to see if all hosts have failed and the running result is not ok 8218 1726776698.01257: done checking to see if all hosts have failed 8218 1726776698.01257: getting the remaining hosts for this loop 8218 1726776698.01258: done getting the remaining hosts for this loop 8218 1726776698.01261: getting the next task for host managed_node2 8218 1726776698.01266: done getting next task for host managed_node2 8218 1726776698.01269: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8218 1726776698.01273: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776698.01287: getting variables 8218 1726776698.01289: in VariableManager get_vars() 8218 1726776698.01317: Calling all_inventory to load vars for managed_node2 8218 1726776698.01320: Calling groups_inventory to load vars for managed_node2 8218 1726776698.01322: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.01331: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.01333: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.01335: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.01438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.01555: done with get_vars() 8218 1726776698.01563: done getting variables 8218 1726776698.01603: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 16:11:38 -0400 (0:00:00.018) 0:01:23.846 **** 8218 1726776698.01626: entering _queue_task() for managed_node2/fail 8218 1726776698.01773: worker is 1 (out of 1 available) 8218 1726776698.01787: exiting _queue_task() for managed_node2/fail 8218 1726776698.01798: done queuing things up, now waiting for results queue to drain 8218 1726776698.01799: waiting for pending results... 
11328 1726776698.01924: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors
11328 1726776698.02042: in run() - task 120fa90a-8a95-cec2-986e-000000000c44
11328 1726776698.02057: variable 'ansible_search_path' from source: unknown
11328 1726776698.02061: variable 'ansible_search_path' from source: unknown
11328 1726776698.02088: calling self._execute()
11328 1726776698.02153: variable 'ansible_host' from source: host vars for 'managed_node2'
11328 1726776698.02162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11328 1726776698.02172: variable 'omit' from source: magic vars
11328 1726776698.02493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11328 1726776698.02723: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11328 1726776698.02756: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11328 1726776698.02784: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11328 1726776698.02811: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11328 1726776698.02886: variable '__kernel_settings_register_verify_values' from source: set_fact
11328 1726776698.02907: Evaluated conditional (__kernel_settings_register_verify_values is failed): False
11328 1726776698.02912: when evaluation is False, skipping this task
11328 1726776698.02916: _execute() done
11328 1726776698.02920: dumping result to json
11328 1726776698.02924: done dumping result, returning
11328 1726776698.02931: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-cec2-986e-000000000c44]
11328 1726776698.02938: sending task result for task 120fa90a-8a95-cec2-986e-000000000c44
11328 1726776698.02960: done sending task result for task 120fa90a-8a95-cec2-986e-000000000c44
11328 1726776698.02964: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__kernel_settings_register_verify_values is failed",
    "skip_reason": "Conditional result was False"
}
8218 1726776698.03063: no more pending results, returning what we have
8218 1726776698.03066: results queue empty
8218 1726776698.03067: checking for any_errors_fatal
8218 1726776698.03071: done checking for any_errors_fatal
8218 1726776698.03072: checking for max_fail_percentage
8218 1726776698.03073: done checking for max_fail_percentage
8218 1726776698.03074: checking to see if all hosts have failed and the running result is not ok
8218 1726776698.03074: done checking to see if all hosts have failed
8218 1726776698.03075: getting the remaining hosts for this loop
8218 1726776698.03076: done getting the remaining hosts for this loop
8218 1726776698.03079: getting the next task for host managed_node2
8218 1726776698.03085: done getting next task for host managed_node2
8218 1726776698.03089: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
8218 1726776698.03091: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776698.03106: getting variables
8218 1726776698.03107: in VariableManager get_vars()
8218 1726776698.03143: Calling all_inventory to load vars for managed_node2
8218 1726776698.03146: Calling groups_inventory to load vars for managed_node2
8218 1726776698.03148: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776698.03154: Calling all_plugins_play to load vars for managed_node2
8218 1726776698.03156: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776698.03157: Calling groups_plugins_play to load vars for managed_node2
8218 1726776698.03256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776698.03403: done with get_vars()
8218 1726776698.03409: done getting variables
8218 1726776698.03451: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177
Thursday 19 September 2024 16:11:38 -0400 (0:00:00.018) 0:01:23.865 ****
8218 1726776698.03472: entering _queue_task() for managed_node2/set_fact
8218 1726776698.03612: worker is 1 (out of 1 available)
8218 1726776698.03626: exiting _queue_task() for managed_node2/set_fact
8218 1726776698.03639: done queuing things up, now waiting for results queue to drain
8218 1726776698.03640: waiting for pending results...
11329 1726776698.03759: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
11329 1726776698.03867: in run() - task 120fa90a-8a95-cec2-986e-000000000a04
11329 1726776698.03883: variable 'ansible_search_path' from source: unknown
11329 1726776698.03887: variable 'ansible_search_path' from source: unknown
11329 1726776698.03913: calling self._execute()
11329 1726776698.03978: variable 'ansible_host' from source: host vars for 'managed_node2'
11329 1726776698.03987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11329 1726776698.03996: variable 'omit' from source: magic vars
11329 1726776698.04071: variable 'omit' from source: magic vars
11329 1726776698.04106: variable 'omit' from source: magic vars
11329 1726776698.04131: variable 'omit' from source: magic vars
11329 1726776698.04163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11329 1726776698.04193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11329 1726776698.04211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11329 1726776698.04225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11329 1726776698.04238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11329 1726776698.04261: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11329 1726776698.04269: variable 'ansible_host' from source: host vars for 'managed_node2'
11329 1726776698.04273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11329 1726776698.04340: Set connection var ansible_connection to ssh
11329 1726776698.04348: Set connection var ansible_pipelining to False
11329 1726776698.04355: Set connection var ansible_timeout to 10
11329 1726776698.04362: Set connection var ansible_module_compression to ZIP_DEFLATED
11329 1726776698.04369: Set connection var ansible_shell_type to sh
11329 1726776698.04374: Set connection var ansible_shell_executable to /bin/sh
11329 1726776698.04389: variable 'ansible_shell_executable' from source: unknown
11329 1726776698.04393: variable 'ansible_connection' from source: unknown
11329 1726776698.04397: variable 'ansible_module_compression' from source: unknown
11329 1726776698.04400: variable 'ansible_shell_type' from source: unknown
11329 1726776698.04403: variable 'ansible_shell_executable' from source: unknown
11329 1726776698.04406: variable 'ansible_host' from source: host vars for 'managed_node2'
11329 1726776698.04410: variable 'ansible_pipelining' from source: unknown
11329 1726776698.04414: variable 'ansible_timeout' from source: unknown
11329 1726776698.04418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11329 1726776698.04507: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
11329 1726776698.04519: variable 'omit' from source: magic vars
11329 1726776698.04525: starting attempt loop
11329 1726776698.04530: running the handler
11329 1726776698.04540: handler run complete
11329 1726776698.04549: attempt loop complete, returning result
11329 1726776698.04552: _execute() done
11329 1726776698.04555: dumping result to json
11329 1726776698.04559: done dumping result, returning
11329 1726776698.04566: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000a04]
11329 1726776698.04573: sending task result for task 120fa90a-8a95-cec2-986e-000000000a04
11329 1726776698.04593: done sending task result for task 120fa90a-8a95-cec2-986e-000000000a04
11329 1726776698.04596: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "kernel_settings_reboot_required": false
    },
    "changed": false
}
8218 1726776698.04704: no more pending results, returning what we have
8218 1726776698.04707: results queue empty
8218 1726776698.04708: checking for any_errors_fatal
8218 1726776698.04714: done checking for any_errors_fatal
8218 1726776698.04714: checking for max_fail_percentage
8218 1726776698.04715: done checking for max_fail_percentage
8218 1726776698.04716: checking to see if all hosts have failed and the running result is not ok
8218 1726776698.04717: done checking to see if all hosts have failed
8218 1726776698.04717: getting the remaining hosts for this loop
8218 1726776698.04718: done getting the remaining hosts for this loop
8218 1726776698.04721: getting the next task for host managed_node2
8218 1726776698.04726: done getting next task for host managed_node2
8218 1726776698.04731: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
8218 1726776698.04733: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776698.04742: getting variables
8218 1726776698.04743: in VariableManager get_vars()
8218 1726776698.04770: Calling all_inventory to load vars for managed_node2
8218 1726776698.04772: Calling groups_inventory to load vars for managed_node2
8218 1726776698.04773: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776698.04779: Calling all_plugins_play to load vars for managed_node2
8218 1726776698.04781: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776698.04782: Calling groups_plugins_play to load vars for managed_node2
8218 1726776698.04880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776698.04992: done with get_vars()
8218 1726776698.04998: done getting variables
8218 1726776698.05038: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181
Thursday 19 September 2024 16:11:38 -0400 (0:00:00.015) 0:01:23.881 ****
8218 1726776698.05058: entering _queue_task() for managed_node2/set_fact
8218 1726776698.05195: worker is 1 (out of 1 available)
8218 1726776698.05208: exiting _queue_task() for managed_node2/set_fact
8218 1726776698.05219: done queuing things up, now waiting for results queue to drain
8218 1726776698.05221: waiting for pending results...
11330 1726776698.05340: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
11330 1726776698.05446: in run() - task 120fa90a-8a95-cec2-986e-000000000a05
11330 1726776698.05462: variable 'ansible_search_path' from source: unknown
11330 1726776698.05467: variable 'ansible_search_path' from source: unknown
11330 1726776698.05492: calling self._execute()
11330 1726776698.05554: variable 'ansible_host' from source: host vars for 'managed_node2'
11330 1726776698.05563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11330 1726776698.05573: variable 'omit' from source: magic vars
11330 1726776698.05645: variable 'omit' from source: magic vars
11330 1726776698.05680: variable 'omit' from source: magic vars
11330 1726776698.05937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11330 1726776698.06163: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11330 1726776698.06196: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11330 1726776698.06223: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11330 1726776698.06250: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11330 1726776698.06345: variable '__kernel_settings_register_profile' from source: set_fact
11330 1726776698.06358: variable '__kernel_settings_register_mode' from source: set_fact
11330 1726776698.06368: variable '__kernel_settings_register_apply' from source: set_fact
11330 1726776698.06403: variable 'omit' from source: magic vars
11330 1726776698.06425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11330 1726776698.06449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11330 1726776698.06467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11330 1726776698.06480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11330 1726776698.06489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11330 1726776698.06510: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11330 1726776698.06515: variable 'ansible_host' from source: host vars for 'managed_node2'
11330 1726776698.06519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11330 1726776698.06585: Set connection var ansible_connection to ssh
11330 1726776698.06592: Set connection var ansible_pipelining to False
11330 1726776698.06599: Set connection var ansible_timeout to 10
11330 1726776698.06606: Set connection var ansible_module_compression to ZIP_DEFLATED
11330 1726776698.06611: Set connection var ansible_shell_type to sh
11330 1726776698.06617: Set connection var ansible_shell_executable to /bin/sh
11330 1726776698.06634: variable 'ansible_shell_executable' from source: unknown
11330 1726776698.06638: variable 'ansible_connection' from source: unknown
11330 1726776698.06642: variable 'ansible_module_compression' from source: unknown
11330 1726776698.06645: variable 'ansible_shell_type' from source: unknown
11330 1726776698.06648: variable 'ansible_shell_executable' from source: unknown
11330 1726776698.06651: variable 'ansible_host' from source: host vars for 'managed_node2'
11330 1726776698.06655: variable 'ansible_pipelining' from source: unknown
11330 1726776698.06659: variable 'ansible_timeout' from source: unknown
11330 1726776698.06663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11330 1726776698.06728: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
11330 1726776698.06742: variable 'omit' from source: magic vars
11330 1726776698.06748: starting attempt loop
11330 1726776698.06751: running the handler
11330 1726776698.06761: handler run complete
11330 1726776698.06771: attempt loop complete, returning result
11330 1726776698.06774: _execute() done
11330 1726776698.06777: dumping result to json
11330 1726776698.06780: done dumping result, returning
11330 1726776698.06786: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-cec2-986e-000000000a05]
11330 1726776698.06792: sending task result for task 120fa90a-8a95-cec2-986e-000000000a05
11330 1726776698.06810: done sending task result for task 120fa90a-8a95-cec2-986e-000000000a05
11330 1726776698.06814: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "__kernel_settings_changed": true
    },
    "changed": false
}
8218 1726776698.06920: no more pending results, returning what we have
8218 1726776698.06923: results queue empty
8218 1726776698.06924: checking for any_errors_fatal
8218 1726776698.06927: done checking for any_errors_fatal
8218 1726776698.06928: checking for max_fail_percentage
8218 1726776698.06931: done checking for max_fail_percentage
8218 1726776698.06932: checking to see if all hosts have failed and the running result is not ok
8218 1726776698.06933: done checking to see if all hosts have failed
8218 1726776698.06933: getting the remaining hosts for this loop
8218 1726776698.06934: done getting the remaining hosts for this loop
8218 1726776698.06937: getting the next task for host managed_node2
8218 1726776698.06944: done getting next task for host managed_node2
8218 1726776698.06946: ^ task is: TASK: meta (role_complete)
8218 1726776698.06948: ^ state is: HOST STATE: block=2, task=46, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776698.06957: getting variables
8218 1726776698.06958: in VariableManager get_vars()
8218 1726776698.06987: Calling all_inventory to load vars for managed_node2
8218 1726776698.06990: Calling groups_inventory to load vars for managed_node2
8218 1726776698.06992: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776698.06999: Calling all_plugins_play to load vars for managed_node2
8218 1726776698.07001: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776698.07004: Calling groups_plugins_play to load vars for managed_node2
8218 1726776698.07103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776698.07251: done with get_vars()
8218 1726776698.07257: done getting variables
8218 1726776698.07307: done queuing things up, now waiting for results queue to drain
8218 1726776698.07308: results queue empty
8218 1726776698.07309: checking for any_errors_fatal
8218 1726776698.07311: done checking for any_errors_fatal
8218 1726776698.07312: checking for max_fail_percentage
8218 1726776698.07312: done checking for max_fail_percentage
8218 1726776698.07316: checking to see if all hosts have failed and the running result is not ok
8218 1726776698.07316: done checking to see if all hosts have failed
8218 1726776698.07316: getting the remaining hosts for this loop
8218 1726776698.07317: done getting the remaining hosts for this loop
8218 1726776698.07318: getting the next task for host managed_node2
8218 1726776698.07320: done getting next task for host managed_node2
8218 1726776698.07322: ^ task is: TASK: meta (flush_handlers)
8218 1726776698.07322: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776698.07326: getting variables
8218 1726776698.07326: in VariableManager get_vars()
8218 1726776698.07336: Calling all_inventory to load vars for managed_node2
8218 1726776698.07338: Calling groups_inventory to load vars for managed_node2
8218 1726776698.07339: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776698.07342: Calling all_plugins_play to load vars for managed_node2
8218 1726776698.07343: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776698.07344: Calling groups_plugins_play to load vars for managed_node2
8218 1726776698.07416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776698.07517: done with get_vars()
8218 1726776698.07523: done getting variables

TASK [Force handlers] **********************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:191
Thursday 19 September 2024 16:11:38 -0400 (0:00:00.025) 0:01:23.906 ****
8218 1726776698.07564: in VariableManager get_vars()
8218 1726776698.07571: Calling all_inventory to load vars for managed_node2
8218 1726776698.07573: Calling groups_inventory to load vars for managed_node2
8218 1726776698.07574: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776698.07576: Calling all_plugins_play to load vars for managed_node2
8218 1726776698.07578: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776698.07579: Calling groups_plugins_play to load vars for managed_node2
8218 1726776698.07652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776698.07750: done with get_vars()
META: triggered running handlers for managed_node2
8218 1726776698.07759: done queuing things up, now waiting for results queue to drain
8218 1726776698.07760: results queue empty
8218 1726776698.07760: checking for any_errors_fatal
8218 1726776698.07762: done checking for any_errors_fatal
8218 1726776698.07762: checking for max_fail_percentage
8218 1726776698.07763: done checking for max_fail_percentage
8218 1726776698.07763: checking to see if all hosts have failed and the running result is not ok
8218 1726776698.07763: done checking to see if all hosts have failed
8218 1726776698.07764: getting the remaining hosts for this loop
8218 1726776698.07764: done getting the remaining hosts for this loop
8218 1726776698.07766: getting the next task for host managed_node2
8218 1726776698.07768: done getting next task for host managed_node2
8218 1726776698.07769: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false
8218 1726776698.07770: ^ state is: HOST STATE: block=2, task=48, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8218 1726776698.07772: getting variables
8218 1726776698.07772: in VariableManager get_vars()
8218 1726776698.07778: Calling all_inventory to load vars for managed_node2
8218 1726776698.07780: Calling groups_inventory to load vars for managed_node2
8218 1726776698.07781: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776698.07784: Calling all_plugins_play to load vars for managed_node2
8218 1726776698.07785: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776698.07786: Calling groups_plugins_play to load vars for managed_node2
8218 1726776698.08026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776698.08122: done with get_vars()
8218 1726776698.08130: done getting variables
8218 1726776698.08153: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Ensure kernel_settings_reboot_required is not set or is false] ***********
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:194
Thursday 19 September 2024 16:11:38 -0400 (0:00:00.006) 0:01:23.912 ****
8218 1726776698.08166: entering _queue_task() for managed_node2/assert
8218 1726776698.08311: worker is 1 (out of 1 available)
8218 1726776698.08324: exiting _queue_task() for managed_node2/assert
8218 1726776698.08337: done queuing things up, now waiting for results queue to drain
8218 1726776698.08338: waiting for pending results...
11331 1726776698.08457: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 11331 1726776698.08561: in run() - task 120fa90a-8a95-cec2-986e-000000000027 11331 1726776698.08578: variable 'ansible_search_path' from source: unknown 11331 1726776698.08606: calling self._execute() 11331 1726776698.08677: variable 'ansible_host' from source: host vars for 'managed_node2' 11331 1726776698.08686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11331 1726776698.08695: variable 'omit' from source: magic vars 11331 1726776698.08771: variable 'omit' from source: magic vars 11331 1726776698.08800: variable 'omit' from source: magic vars 11331 1726776698.08822: variable 'omit' from source: magic vars 11331 1726776698.08855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11331 1726776698.08882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11331 1726776698.08901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11331 1726776698.08916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11331 1726776698.08927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11331 1726776698.08951: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11331 1726776698.08956: variable 'ansible_host' from source: host vars for 'managed_node2' 11331 1726776698.08960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11331 1726776698.09030: Set connection var ansible_connection to ssh 11331 1726776698.09038: Set connection var ansible_pipelining to False 11331 1726776698.09045: Set connection var ansible_timeout to 10 11331 1726776698.09052: Set connection 
var ansible_module_compression to ZIP_DEFLATED 11331 1726776698.09057: Set connection var ansible_shell_type to sh 11331 1726776698.09062: Set connection var ansible_shell_executable to /bin/sh 11331 1726776698.09080: variable 'ansible_shell_executable' from source: unknown 11331 1726776698.09084: variable 'ansible_connection' from source: unknown 11331 1726776698.09087: variable 'ansible_module_compression' from source: unknown 11331 1726776698.09091: variable 'ansible_shell_type' from source: unknown 11331 1726776698.09094: variable 'ansible_shell_executable' from source: unknown 11331 1726776698.09097: variable 'ansible_host' from source: host vars for 'managed_node2' 11331 1726776698.09102: variable 'ansible_pipelining' from source: unknown 11331 1726776698.09105: variable 'ansible_timeout' from source: unknown 11331 1726776698.09108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11331 1726776698.09195: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11331 1726776698.09205: variable 'omit' from source: magic vars 11331 1726776698.09209: starting attempt loop 11331 1726776698.09212: running the handler 11331 1726776698.09467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11331 1726776698.13784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11331 1726776698.13827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11331 1726776698.13856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11331 1726776698.13892: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11331 1726776698.13912: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11331 1726776698.13956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11331 1726776698.13979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11331 1726776698.13998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11331 1726776698.14024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11331 1726776698.14037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11331 1726776698.14107: variable 'kernel_settings_reboot_required' from source: set_fact 11331 1726776698.14121: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 11331 1726776698.14130: handler run complete 11331 1726776698.14145: attempt loop complete, returning result 11331 1726776698.14149: _execute() done 11331 1726776698.14151: dumping result to json 11331 1726776698.14155: done dumping result, returning 11331 1726776698.14161: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [120fa90a-8a95-cec2-986e-000000000027] 11331 1726776698.14167: sending task result for 
task 120fa90a-8a95-cec2-986e-000000000027 11331 1726776698.14188: done sending task result for task 120fa90a-8a95-cec2-986e-000000000027 11331 1726776698.14191: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776698.14299: no more pending results, returning what we have 8218 1726776698.14302: results queue empty 8218 1726776698.14303: checking for any_errors_fatal 8218 1726776698.14305: done checking for any_errors_fatal 8218 1726776698.14306: checking for max_fail_percentage 8218 1726776698.14307: done checking for max_fail_percentage 8218 1726776698.14308: checking to see if all hosts have failed and the running result is not ok 8218 1726776698.14308: done checking to see if all hosts have failed 8218 1726776698.14309: getting the remaining hosts for this loop 8218 1726776698.14310: done getting the remaining hosts for this loop 8218 1726776698.14313: getting the next task for host managed_node2 8218 1726776698.14318: done getting next task for host managed_node2 8218 1726776698.14320: ^ task is: TASK: Ensure role reported changed 8218 1726776698.14322: ^ state is: HOST STATE: block=2, task=49, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776698.14324: getting variables 8218 1726776698.14326: in VariableManager get_vars() 8218 1726776698.14360: Calling all_inventory to load vars for managed_node2 8218 1726776698.14363: Calling groups_inventory to load vars for managed_node2 8218 1726776698.14365: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.14373: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.14381: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.14383: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.14541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.14658: done with get_vars() 8218 1726776698.14667: done getting variables 8218 1726776698.14709: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:198 Thursday 19 September 2024 16:11:38 -0400 (0:00:00.065) 0:01:23.977 **** 8218 1726776698.14731: entering _queue_task() for managed_node2/assert 8218 1726776698.14892: worker is 1 (out of 1 available) 8218 1726776698.14904: exiting _queue_task() for managed_node2/assert 8218 1726776698.14915: done queuing things up, now waiting for results queue to drain 8218 1726776698.14917: waiting for pending results... 
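The assert just completed above evaluated the conditional `(not kernel_settings_reboot_required | d(false))`, logged verbatim. A hypothetical sketch of what that verification task likely looks like — the task name and conditional are taken from the log; the `msg` and exact YAML layout are assumptions, not the verbatim test file:

```yaml
# Sketch only: reconstructed from the logged task name and evaluated
# conditional; the failure message is an assumption.
- name: Ensure kernel_settings_reboot_required is not set or is false
  assert:
    that: not kernel_settings_reboot_required | d(false)
    msg: kernel_settings should not have requested a reboot  # assumed wording
```

The `d(false)` (alias of `default(false)`) filter makes the check safe even when the role never set the fact at all.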
11332 1726776698.15041: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 11332 1726776698.15142: in run() - task 120fa90a-8a95-cec2-986e-000000000028 11332 1726776698.15158: variable 'ansible_search_path' from source: unknown 11332 1726776698.15186: calling self._execute() 11332 1726776698.15256: variable 'ansible_host' from source: host vars for 'managed_node2' 11332 1726776698.15268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11332 1726776698.15277: variable 'omit' from source: magic vars 11332 1726776698.15355: variable 'omit' from source: magic vars 11332 1726776698.15389: variable 'omit' from source: magic vars 11332 1726776698.15412: variable 'omit' from source: magic vars 11332 1726776698.15446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11332 1726776698.15474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11332 1726776698.15494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11332 1726776698.15508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11332 1726776698.15519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11332 1726776698.15542: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11332 1726776698.15548: variable 'ansible_host' from source: host vars for 'managed_node2' 11332 1726776698.15552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11332 1726776698.15619: Set connection var ansible_connection to ssh 11332 1726776698.15627: Set connection var ansible_pipelining to False 11332 1726776698.15636: Set connection var ansible_timeout to 10 11332 1726776698.15643: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11332 1726776698.15648: Set connection var ansible_shell_type to sh 11332 1726776698.15654: Set connection var ansible_shell_executable to /bin/sh 11332 1726776698.15671: variable 'ansible_shell_executable' from source: unknown 11332 1726776698.15675: variable 'ansible_connection' from source: unknown 11332 1726776698.15678: variable 'ansible_module_compression' from source: unknown 11332 1726776698.15681: variable 'ansible_shell_type' from source: unknown 11332 1726776698.15684: variable 'ansible_shell_executable' from source: unknown 11332 1726776698.15688: variable 'ansible_host' from source: host vars for 'managed_node2' 11332 1726776698.15690: variable 'ansible_pipelining' from source: unknown 11332 1726776698.15692: variable 'ansible_timeout' from source: unknown 11332 1726776698.15694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11332 1726776698.15787: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11332 1726776698.15799: variable 'omit' from source: magic vars 11332 1726776698.15806: starting attempt loop 11332 1726776698.15810: running the handler 11332 1726776698.16051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11332 1726776698.20159: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11332 1726776698.20201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11332 1726776698.20230: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11332 1726776698.20256: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11332 1726776698.20278: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11332 1726776698.20327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11332 1726776698.20351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11332 1726776698.20371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11332 1726776698.20398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11332 1726776698.20409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11332 1726776698.20471: variable '__kernel_settings_changed' from source: set_fact 11332 1726776698.20485: Evaluated conditional (__kernel_settings_changed | d(false)): True 11332 1726776698.20491: handler run complete 11332 1726776698.20505: attempt loop complete, returning result 11332 1726776698.20509: _execute() done 11332 1726776698.20511: dumping result to json 11332 1726776698.20515: done dumping result, returning 11332 1726776698.20520: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [120fa90a-8a95-cec2-986e-000000000028] 11332 1726776698.20524: sending task result for task 120fa90a-8a95-cec2-986e-000000000028 11332 
1726776698.20546: done sending task result for task 120fa90a-8a95-cec2-986e-000000000028 11332 1726776698.20549: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8218 1726776698.20652: no more pending results, returning what we have 8218 1726776698.20655: results queue empty 8218 1726776698.20656: checking for any_errors_fatal 8218 1726776698.20661: done checking for any_errors_fatal 8218 1726776698.20662: checking for max_fail_percentage 8218 1726776698.20663: done checking for max_fail_percentage 8218 1726776698.20664: checking to see if all hosts have failed and the running result is not ok 8218 1726776698.20664: done checking to see if all hosts have failed 8218 1726776698.20665: getting the remaining hosts for this loop 8218 1726776698.20666: done getting the remaining hosts for this loop 8218 1726776698.20669: getting the next task for host managed_node2 8218 1726776698.20673: done getting next task for host managed_node2 8218 1726776698.20675: ^ task is: TASK: Check sysctl 8218 1726776698.20677: ^ state is: HOST STATE: block=2, task=50, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776698.20680: getting variables 8218 1726776698.20681: in VariableManager get_vars() 8218 1726776698.20713: Calling all_inventory to load vars for managed_node2 8218 1726776698.20716: Calling groups_inventory to load vars for managed_node2 8218 1726776698.20718: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.20726: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.20730: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.20738: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.20897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.23437: done with get_vars() 8218 1726776698.23444: done getting variables 8218 1726776698.23476: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysctl] ************************************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:202 Thursday 19 September 2024 16:11:38 -0400 (0:00:00.087) 0:01:24.065 **** 8218 1726776698.23491: entering _queue_task() for managed_node2/shell 8218 1726776698.23657: worker is 1 (out of 1 available) 8218 1726776698.23671: exiting _queue_task() for managed_node2/shell 8218 1726776698.23682: done queuing things up, now waiting for results queue to drain 8218 1726776698.23684: waiting for pending results... 
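The second assert above ("Ensure role reported changed") evaluated `(__kernel_settings_changed | d(false))`, also logged verbatim. A hedged sketch of the corresponding task — only the task name and conditional come from the log; the rest of the shape is assumed:

```yaml
# Sketch only: task name and conditional are from the log output;
# the msg is an assumption.
- name: Ensure role reported changed
  assert:
    that: __kernel_settings_changed | d(false)
    msg: the role should have reported a change  # assumed wording
```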
11333 1726776698.23806: running TaskExecutor() for managed_node2/TASK: Check sysctl 11333 1726776698.23918: in run() - task 120fa90a-8a95-cec2-986e-000000000029 11333 1726776698.23935: variable 'ansible_search_path' from source: unknown 11333 1726776698.23963: calling self._execute() 11333 1726776698.24033: variable 'ansible_host' from source: host vars for 'managed_node2' 11333 1726776698.24042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11333 1726776698.24051: variable 'omit' from source: magic vars 11333 1726776698.24131: variable 'omit' from source: magic vars 11333 1726776698.24156: variable 'omit' from source: magic vars 11333 1726776698.24181: variable 'omit' from source: magic vars 11333 1726776698.24214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11333 1726776698.24243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11333 1726776698.24262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11333 1726776698.24279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11333 1726776698.24290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11333 1726776698.24313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11333 1726776698.24318: variable 'ansible_host' from source: host vars for 'managed_node2' 11333 1726776698.24322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11333 1726776698.24393: Set connection var ansible_connection to ssh 11333 1726776698.24401: Set connection var ansible_pipelining to False 11333 1726776698.24408: Set connection var ansible_timeout to 10 11333 1726776698.24415: Set connection var ansible_module_compression to ZIP_DEFLATED 
11333 1726776698.24420: Set connection var ansible_shell_type to sh 11333 1726776698.24426: Set connection var ansible_shell_executable to /bin/sh 11333 1726776698.24442: variable 'ansible_shell_executable' from source: unknown 11333 1726776698.24446: variable 'ansible_connection' from source: unknown 11333 1726776698.24449: variable 'ansible_module_compression' from source: unknown 11333 1726776698.24452: variable 'ansible_shell_type' from source: unknown 11333 1726776698.24455: variable 'ansible_shell_executable' from source: unknown 11333 1726776698.24458: variable 'ansible_host' from source: host vars for 'managed_node2' 11333 1726776698.24460: variable 'ansible_pipelining' from source: unknown 11333 1726776698.24462: variable 'ansible_timeout' from source: unknown 11333 1726776698.24464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11333 1726776698.24559: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11333 1726776698.24573: variable 'omit' from source: magic vars 11333 1726776698.24577: starting attempt loop 11333 1726776698.24579: running the handler 11333 1726776698.24586: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11333 1726776698.24600: _low_level_execute_command(): starting 11333 1726776698.24605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11333 1726776698.26953: stdout chunk (state=2): >>>/root <<< 11333 1726776698.27073: stderr chunk (state=3): >>><<< 11333 1726776698.27080: stdout 
chunk (state=3): >>><<< 11333 1726776698.27096: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11333 1726776698.27109: _low_level_execute_command(): starting 11333 1726776698.27115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993 `" && echo ansible-tmp-1726776698.2710416-11333-15236047371993="` echo /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993 `" ) && sleep 0' 11333 1726776698.29630: stdout chunk (state=2): >>>ansible-tmp-1726776698.2710416-11333-15236047371993=/root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993 <<< 11333 1726776698.29751: stderr chunk (state=3): >>><<< 11333 1726776698.29757: stdout chunk (state=3): >>><<< 11333 1726776698.29772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776698.2710416-11333-15236047371993=/root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993 , stderr= 11333 1726776698.29794: variable 'ansible_module_compression' from source: unknown 11333 1726776698.29840: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11333 1726776698.29874: variable 'ansible_facts' from source: unknown 11333 1726776698.29943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/AnsiballZ_command.py 11333 1726776698.30035: Sending initial data 11333 1726776698.30043: Sent initial data (154 bytes) 11333 1726776698.32458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmppau24j89 /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/AnsiballZ_command.py <<< 11333 1726776698.33510: stderr chunk (state=3): >>><<< 11333 1726776698.33516: stdout chunk (state=3): >>><<< 11333 1726776698.33535: done transferring module to remote 11333 
1726776698.33545: _low_level_execute_command(): starting 11333 1726776698.33550: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/ /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/AnsiballZ_command.py && sleep 0' 11333 1726776698.35814: stderr chunk (state=2): >>><<< 11333 1726776698.35822: stdout chunk (state=2): >>><<< 11333 1726776698.35837: _low_level_execute_command() done: rc=0, stdout=, stderr= 11333 1726776698.35841: _low_level_execute_command(): starting 11333 1726776698.35846: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/AnsiballZ_command.py && sleep 0' 11333 1726776698.51540: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "start": "2024-09-19 16:11:38.507974", "end": "2024-09-19 16:11:38.513761", "delta": "0:00:00.005787", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11333 1726776698.52685: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11333 1726776698.52735: stderr chunk (state=3): >>><<< 11333 1726776698.52742: stdout chunk (state=3): >>><<< 11333 1726776698.52759: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "start": "2024-09-19 16:11:38.507974", "end": "2024-09-19 16:11:38.513761", "delta": "0:00:00.005787", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11333 1726776698.52802: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11333 1726776698.52814: _low_level_execute_command(): starting 11333 1726776698.52821: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776698.2710416-11333-15236047371993/ > /dev/null 2>&1 && sleep 0' 11333 1726776698.55238: stderr chunk (state=2): >>><<< 11333 1726776698.55248: stdout chunk (state=2): >>><<< 11333 1726776698.55262: _low_level_execute_command() done: 
rc=0, stdout=, stderr= 11333 1726776698.55270: handler run complete 11333 1726776698.55289: Evaluated conditional (False): False 11333 1726776698.55298: attempt loop complete, returning result 11333 1726776698.55302: _execute() done 11333 1726776698.55305: dumping result to json 11333 1726776698.55310: done dumping result, returning 11333 1726776698.55317: done running TaskExecutor() for managed_node2/TASK: Check sysctl [120fa90a-8a95-cec2-986e-000000000029] 11333 1726776698.55323: sending task result for task 120fa90a-8a95-cec2-986e-000000000029 11333 1726776698.55356: done sending task result for task 120fa90a-8a95-cec2-986e-000000000029 11333 1726776698.55360: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "delta": "0:00:00.005787", "end": "2024-09-19 16:11:38.513761", "rc": 0, "start": "2024-09-19 16:11:38.507974" } 8218 1726776698.55490: no more pending results, returning what we have 8218 1726776698.55493: results queue empty 8218 1726776698.55494: checking for any_errors_fatal 8218 1726776698.55500: done checking for any_errors_fatal 8218 1726776698.55501: checking for max_fail_percentage 8218 1726776698.55502: done checking for max_fail_percentage 8218 1726776698.55503: checking to see if all hosts have failed and the running result is not ok 8218 1726776698.55503: done checking to see if all hosts have failed 8218 1726776698.55504: getting the remaining hosts for this loop 8218 1726776698.55505: done getting the remaining hosts for this loop 8218 1726776698.55508: getting the next task for host managed_node2 8218 1726776698.55513: done getting next task for host managed_node2 8218 1726776698.55514: ^ task is: TASK: Check sysfs after role runs 8218 1726776698.55516: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776698.55519: getting variables 8218 1726776698.55520: in VariableManager get_vars() 8218 1726776698.55555: Calling all_inventory to load vars for managed_node2 8218 1726776698.55558: Calling groups_inventory to load vars for managed_node2 8218 1726776698.55560: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.55572: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.55574: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.55577: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.55694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.55809: done with get_vars() 8218 1726776698.55819: done getting variables 8218 1726776698.55866: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:208 Thursday 19 September 2024 16:11:38 -0400 (0:00:00.323) 0:01:24.389 **** 8218 1726776698.55891: entering _queue_task() for managed_node2/command 8218 1726776698.56051: worker is 1 (out of 1 available) 8218 1726776698.56067: exiting _queue_task() for managed_node2/command 8218 1726776698.56077: done queuing things up, now waiting for results queue to drain 8218 1726776698.56079: waiting for pending results... 
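The "Check sysctl" task that just ran invoked the shell action with the exact command string captured in the module invocation above. A sketch of the task as it likely appears at tests_change_settings.yml:202 — the command is verbatim from the log; `changed_when: false` is inferred from the logged `Evaluated conditional (False): False` and the final `"changed": false` result:

```yaml
# Sketch: cmd string is verbatim from the log's module_args;
# changed_when is inferred from the logged conditional evaluation.
- name: Check sysctl
  shell: |
    set -euo pipefail
    sysctl -n fs.file-max | grep -Lvxq 400001
  changed_when: false
```

With `grep -xqv`, the pipeline exits 0 only if the printed sysctl value is not exactly `400001`, so rc=0 here confirms fs.file-max no longer holds the previously applied setting.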
11341 1726776698.56202: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 11341 1726776698.56303: in run() - task 120fa90a-8a95-cec2-986e-00000000002a 11341 1726776698.56319: variable 'ansible_search_path' from source: unknown 11341 1726776698.56348: calling self._execute() 11341 1726776698.56417: variable 'ansible_host' from source: host vars for 'managed_node2' 11341 1726776698.56426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11341 1726776698.56436: variable 'omit' from source: magic vars 11341 1726776698.56582: variable 'omit' from source: magic vars 11341 1726776698.56614: variable 'omit' from source: magic vars 11341 1726776698.56640: variable 'omit' from source: magic vars 11341 1726776698.56674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11341 1726776698.56698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11341 1726776698.56717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11341 1726776698.56733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11341 1726776698.56744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11341 1726776698.56768: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11341 1726776698.56773: variable 'ansible_host' from source: host vars for 'managed_node2' 11341 1726776698.56778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11341 1726776698.56848: Set connection var ansible_connection to ssh 11341 1726776698.56856: Set connection var ansible_pipelining to False 11341 1726776698.56863: Set connection var ansible_timeout to 10 11341 1726776698.56870: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11341 1726776698.56876: Set connection var ansible_shell_type to sh 11341 1726776698.56881: Set connection var ansible_shell_executable to /bin/sh 11341 1726776698.56896: variable 'ansible_shell_executable' from source: unknown 11341 1726776698.56900: variable 'ansible_connection' from source: unknown 11341 1726776698.56903: variable 'ansible_module_compression' from source: unknown 11341 1726776698.56907: variable 'ansible_shell_type' from source: unknown 11341 1726776698.56910: variable 'ansible_shell_executable' from source: unknown 11341 1726776698.56914: variable 'ansible_host' from source: host vars for 'managed_node2' 11341 1726776698.56919: variable 'ansible_pipelining' from source: unknown 11341 1726776698.56922: variable 'ansible_timeout' from source: unknown 11341 1726776698.56926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11341 1726776698.57015: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11341 1726776698.57027: variable 'omit' from source: magic vars 11341 1726776698.57035: starting attempt loop 11341 1726776698.57039: running the handler 11341 1726776698.57051: _low_level_execute_command(): starting 11341 1726776698.57059: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11341 1726776698.59375: stdout chunk (state=2): >>>/root <<< 11341 1726776698.59493: stderr chunk (state=3): >>><<< 11341 1726776698.59499: stdout chunk (state=3): >>><<< 11341 1726776698.59516: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11341 1726776698.59530: _low_level_execute_command(): starting 11341 1726776698.59538: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631 `" && echo ansible-tmp-1726776698.5952487-11341-145051994923631="` echo /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631 `" ) && sleep 0' 11341 1726776698.62115: stdout chunk (state=2): >>>ansible-tmp-1726776698.5952487-11341-145051994923631=/root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631 <<< 11341 1726776698.62243: stderr chunk (state=3): >>><<< 11341 1726776698.62249: stdout chunk (state=3): >>><<< 11341 1726776698.62262: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776698.5952487-11341-145051994923631=/root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631 , stderr= 11341 1726776698.62287: variable 'ansible_module_compression' from source: unknown 11341 1726776698.62330: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11341 1726776698.62361: variable 'ansible_facts' from source: unknown 11341 1726776698.62435: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/AnsiballZ_command.py 11341 1726776698.62531: Sending initial data 11341 1726776698.62541: Sent initial data (155 bytes) 11341 1726776698.65038: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmprphbqtrb /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/AnsiballZ_command.py <<< 11341 1726776698.66107: stderr chunk (state=3): >>><<< 11341 1726776698.66114: stdout chunk (state=3): >>><<< 11341 1726776698.66134: done transferring module to remote 11341 1726776698.66145: _low_level_execute_command(): starting 11341 1726776698.66150: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/ /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/AnsiballZ_command.py && sleep 
0' 11341 1726776698.68534: stderr chunk (state=2): >>><<< 11341 1726776698.68547: stdout chunk (state=3): >>><<< 11341 1726776698.68557: _low_level_execute_command() done: rc=0, stdout=, stderr= 11341 1726776698.68561: _low_level_execute_command(): starting 11341 1726776698.68568: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/AnsiballZ_command.py && sleep 0' 11341 1726776698.83840: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:11:38.833777", "end": "2024-09-19 16:11:38.836770", "delta": "0:00:00.002993", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -Lxqv 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11341 1726776698.84923: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11341 1726776698.84972: stderr chunk (state=3): >>><<< 11341 1726776698.84979: stdout chunk (state=3): >>><<< 11341 1726776698.84994: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 16:11:38.833777", "end": "2024-09-19 16:11:38.836770", "delta": "0:00:00.002993", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -Lxqv 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
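The "Check sysfs after role runs" task ran the command module (no shell) with the argv shown in the result above. A sketch of the likely task at tests_change_settings.yml:208 — the command is verbatim from the log; `changed_when: false` is again inferred from the `Evaluated conditional (False): False` line and the `"changed": false` result:

```yaml
# Sketch: command taken verbatim from the logged module_args;
# changed_when inferred from the logged conditional evaluation.
- name: Check sysfs after role runs
  command: grep -Lxqv 60666 /sys/class/net/lo/mtu
  changed_when: false
```

Using `command` rather than `shell` here is the natural choice since the check is a single grep with no pipeline.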
11341 1726776698.85038: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -Lxqv 60666 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11341 1726776698.85050: _low_level_execute_command(): starting 11341 1726776698.85056: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776698.5952487-11341-145051994923631/ > /dev/null 2>&1 && sleep 0' 11341 1726776698.87443: stderr chunk (state=2): >>><<< 11341 1726776698.87449: stdout chunk (state=2): >>><<< 11341 1726776698.87462: _low_level_execute_command() done: rc=0, stdout=, stderr= 11341 1726776698.87473: handler run complete 11341 1726776698.87490: Evaluated conditional (False): False 11341 1726776698.87499: attempt loop complete, returning result 11341 1726776698.87502: _execute() done 11341 1726776698.87505: dumping result to json 11341 1726776698.87511: done dumping result, returning 11341 1726776698.87517: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [120fa90a-8a95-cec2-986e-00000000002a] 11341 1726776698.87523: sending task result for task 120fa90a-8a95-cec2-986e-00000000002a 11341 1726776698.87553: done sending task result for task 120fa90a-8a95-cec2-986e-00000000002a 11341 1726776698.87557: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu" ], "delta": 
"0:00:00.002993", "end": "2024-09-19 16:11:38.836770", "rc": 0, "start": "2024-09-19 16:11:38.833777" } 8218 1726776698.87684: no more pending results, returning what we have 8218 1726776698.87687: results queue empty 8218 1726776698.87688: checking for any_errors_fatal 8218 1726776698.87695: done checking for any_errors_fatal 8218 1726776698.87696: checking for max_fail_percentage 8218 1726776698.87697: done checking for max_fail_percentage 8218 1726776698.87698: checking to see if all hosts have failed and the running result is not ok 8218 1726776698.87699: done checking to see if all hosts have failed 8218 1726776698.87699: getting the remaining hosts for this loop 8218 1726776698.87700: done getting the remaining hosts for this loop 8218 1726776698.87703: getting the next task for host managed_node2 8218 1726776698.87710: done getting next task for host managed_node2 8218 1726776698.87712: ^ task is: TASK: Cleanup 8218 1726776698.87714: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776698.87717: getting variables 8218 1726776698.87718: in VariableManager get_vars() 8218 1726776698.87753: Calling all_inventory to load vars for managed_node2 8218 1726776698.87756: Calling groups_inventory to load vars for managed_node2 8218 1726776698.87758: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.87767: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.87769: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.87772: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.87931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.88040: done with get_vars() 8218 1726776698.88049: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:213 Thursday 19 September 2024 16:11:38 -0400 (0:00:00.322) 0:01:24.711 **** 8218 1726776698.88114: entering _queue_task() for managed_node2/include_tasks 8218 1726776698.88267: worker is 1 (out of 1 available) 8218 1726776698.88282: exiting _queue_task() for managed_node2/include_tasks 8218 1726776698.88293: done queuing things up, now waiting for results queue to drain 8218 1726776698.88294: waiting for pending results... 
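[Editor's note] The task result above runs `grep -Lxqv 60666 /sys/class/net/lo/mtu` and treats rc=0 as "the role applied the MTU". The flag combination means: `-v` selects lines that are NOT the pattern, `-x` requires whole-line matches, `-L` reports files with no such selected lines, and `-q` suppresses output, so success implies every line of the file is exactly `60666`. The sketch below is an illustrative re-implementation of that check in Python against a stand-in file; the function name and temp-file path are hypothetical, not part of the role.

```python
# Illustrative sketch only: mimics what `grep -Lxqv <expected> <path>`
# returning rc=0 implies, per the log above. The real test reads
# /sys/class/net/lo/mtu on the managed node; here we use a temp file.
import os
import tempfile
from pathlib import Path

def sysfs_value_matches(path: str, expected: str) -> bool:
    """True when every non-empty line of the file equals `expected`."""
    lines = [ln for ln in Path(path).read_text().splitlines() if ln]
    return bool(lines) and all(ln == expected for ln in lines)

# Stand-in for the sysfs attribute the role is expected to have set:
fd, tmp = tempfile.mkstemp()
os.write(fd, b"60666\n")
os.close(fd)
print(sysfs_value_matches(tmp, "60666"))  # True: value was applied
print(sysfs_value_matches(tmp, "1500"))   # False: value differs
os.remove(tmp)
```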
11349 1726776698.88417: running TaskExecutor() for managed_node2/TASK: Cleanup 11349 1726776698.88519: in run() - task 120fa90a-8a95-cec2-986e-00000000002b 11349 1726776698.88536: variable 'ansible_search_path' from source: unknown 11349 1726776698.88564: calling self._execute() 11349 1726776698.88634: variable 'ansible_host' from source: host vars for 'managed_node2' 11349 1726776698.88645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11349 1726776698.88654: variable 'omit' from source: magic vars 11349 1726776698.88733: _execute() done 11349 1726776698.88738: dumping result to json 11349 1726776698.88742: done dumping result, returning 11349 1726776698.88748: done running TaskExecutor() for managed_node2/TASK: Cleanup [120fa90a-8a95-cec2-986e-00000000002b] 11349 1726776698.88755: sending task result for task 120fa90a-8a95-cec2-986e-00000000002b 11349 1726776698.88782: done sending task result for task 120fa90a-8a95-cec2-986e-00000000002b 11349 1726776698.88786: WORKER PROCESS EXITING 8218 1726776698.88875: no more pending results, returning what we have 8218 1726776698.88879: in VariableManager get_vars() 8218 1726776698.88914: Calling all_inventory to load vars for managed_node2 8218 1726776698.88917: Calling groups_inventory to load vars for managed_node2 8218 1726776698.88918: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.88926: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.88930: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.88933: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.89039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.89155: done with get_vars() 8218 1726776698.89160: variable 'ansible_search_path' from source: unknown 8218 1726776698.89170: we have included files to process 8218 1726776698.89171: generating all_blocks data 
8218 1726776698.89172: done generating all_blocks data 8218 1726776698.89178: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8218 1726776698.89178: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8218 1726776698.89180: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node2 8218 1726776698.89796: done processing included file 8218 1726776698.89798: iterating over new_blocks loaded from include file 8218 1726776698.89799: in VariableManager get_vars() 8218 1726776698.89809: done with get_vars() 8218 1726776698.89810: filtering new block on tags 8218 1726776698.89826: done filtering new block on tags 8218 1726776698.89827: in VariableManager get_vars() 8218 1726776698.89839: done with get_vars() 8218 1726776698.89840: filtering new block on tags 8218 1726776698.89878: done filtering new block on tags 8218 1726776698.89880: done iterating over new_blocks loaded from include file 8218 1726776698.89880: extending task lists for all hosts with included blocks 8218 1726776698.92368: done extending task lists 8218 1726776698.92369: done processing included files 8218 1726776698.92369: results queue empty 8218 1726776698.92370: checking for any_errors_fatal 8218 1726776698.92372: done checking for any_errors_fatal 8218 1726776698.92373: checking for max_fail_percentage 8218 1726776698.92373: done checking for max_fail_percentage 8218 1726776698.92374: checking to see if all hosts have failed and the running result is not ok 8218 1726776698.92374: done checking to see if all hosts have failed 8218 1726776698.92374: getting the remaining hosts for this loop 8218 1726776698.92375: done getting the remaining 
hosts for this loop 8218 1726776698.92376: getting the next task for host managed_node2 8218 1726776698.92379: done getting next task for host managed_node2 8218 1726776698.92381: ^ task is: TASK: Show current tuned profile settings 8218 1726776698.92382: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776698.92383: getting variables 8218 1726776698.92384: in VariableManager get_vars() 8218 1726776698.92391: Calling all_inventory to load vars for managed_node2 8218 1726776698.92393: Calling groups_inventory to load vars for managed_node2 8218 1726776698.92394: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776698.92398: Calling all_plugins_play to load vars for managed_node2 8218 1726776698.92399: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776698.92400: Calling groups_plugins_play to load vars for managed_node2 8218 1726776698.92491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776698.92618: done with get_vars() 8218 1726776698.92629: done getting variables 8218 1726776698.92669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current tuned profile 
settings] ************************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2 Thursday 19 September 2024 16:11:38 -0400 (0:00:00.045) 0:01:24.757 **** 8218 1726776698.92694: entering _queue_task() for managed_node2/command 8218 1726776698.92891: worker is 1 (out of 1 available) 8218 1726776698.92903: exiting _queue_task() for managed_node2/command 8218 1726776698.92915: done queuing things up, now waiting for results queue to drain 8218 1726776698.92917: waiting for pending results... 11352 1726776698.93142: running TaskExecutor() for managed_node2/TASK: Show current tuned profile settings 11352 1726776698.93272: in run() - task 120fa90a-8a95-cec2-986e-000000000cab 11352 1726776698.93291: variable 'ansible_search_path' from source: unknown 11352 1726776698.93295: variable 'ansible_search_path' from source: unknown 11352 1726776698.93325: calling self._execute() 11352 1726776698.93415: variable 'ansible_host' from source: host vars for 'managed_node2' 11352 1726776698.93426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11352 1726776698.93437: variable 'omit' from source: magic vars 11352 1726776698.93546: variable 'omit' from source: magic vars 11352 1726776698.93587: variable 'omit' from source: magic vars 11352 1726776698.93838: variable '__kernel_settings_profile_filename' from source: role '' exported vars 11352 1726776698.93896: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11352 1726776698.93958: variable '__kernel_settings_profile_parent' from source: set_fact 11352 1726776698.93969: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11352 1726776698.94003: variable 'omit' from source: magic vars 11352 1726776698.94036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11352 1726776698.94061: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11352 1726776698.94082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11352 1726776698.94097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11352 1726776698.94108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11352 1726776698.94133: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11352 1726776698.94139: variable 'ansible_host' from source: host vars for 'managed_node2' 11352 1726776698.94143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11352 1726776698.94213: Set connection var ansible_connection to ssh 11352 1726776698.94221: Set connection var ansible_pipelining to False 11352 1726776698.94229: Set connection var ansible_timeout to 10 11352 1726776698.94237: Set connection var ansible_module_compression to ZIP_DEFLATED 11352 1726776698.94243: Set connection var ansible_shell_type to sh 11352 1726776698.94248: Set connection var ansible_shell_executable to /bin/sh 11352 1726776698.94262: variable 'ansible_shell_executable' from source: unknown 11352 1726776698.94268: variable 'ansible_connection' from source: unknown 11352 1726776698.94272: variable 'ansible_module_compression' from source: unknown 11352 1726776698.94276: variable 'ansible_shell_type' from source: unknown 11352 1726776698.94279: variable 'ansible_shell_executable' from source: unknown 11352 1726776698.94282: variable 'ansible_host' from source: host vars for 'managed_node2' 11352 1726776698.94286: variable 'ansible_pipelining' from source: unknown 11352 1726776698.94290: variable 'ansible_timeout' from source: unknown 11352 1726776698.94294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11352 
1726776698.94381: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11352 1726776698.94393: variable 'omit' from source: magic vars 11352 1726776698.94400: starting attempt loop 11352 1726776698.94403: running the handler 11352 1726776698.94417: _low_level_execute_command(): starting 11352 1726776698.94425: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11352 1726776698.96733: stdout chunk (state=2): >>>/root <<< 11352 1726776698.96845: stderr chunk (state=3): >>><<< 11352 1726776698.96852: stdout chunk (state=3): >>><<< 11352 1726776698.96870: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11352 1726776698.96883: _low_level_execute_command(): starting 11352 1726776698.96888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105 `" && echo ansible-tmp-1726776698.9687808-11352-87704570444105="` echo /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105 `" ) && sleep 0' 11352 1726776698.99384: stdout chunk (state=2): >>>ansible-tmp-1726776698.9687808-11352-87704570444105=/root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105 <<< 11352 1726776698.99505: stderr chunk (state=3): >>><<< 11352 1726776698.99511: stdout chunk (state=3): >>><<< 11352 1726776698.99524: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776698.9687808-11352-87704570444105=/root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105 , stderr= 11352 1726776698.99548: variable 'ansible_module_compression' from source: unknown 11352 1726776698.99587: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11352 1726776698.99618: variable 'ansible_facts' from source: unknown 11352 1726776698.99690: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/AnsiballZ_command.py 11352 1726776698.99785: Sending initial data 11352 1726776698.99792: Sent initial data (154 bytes) 11352 1726776699.02239: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp9le8a1a5 /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/AnsiballZ_command.py <<< 11352 1726776699.03273: stderr chunk (state=3): >>><<< 11352 1726776699.03280: stdout chunk (state=3): >>><<< 11352 1726776699.03297: done transferring module to remote 11352 1726776699.03307: _low_level_execute_command(): starting 11352 1726776699.03312: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/ /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/AnsiballZ_command.py && sleep 0' 11352 1726776699.05602: stderr chunk (state=2): >>><<< 11352 1726776699.05610: stdout chunk (state=2): >>><<< 11352 1726776699.05621: _low_level_execute_command() done: rc=0, stdout=, stderr= 11352 1726776699.05625: _low_level_execute_command(): starting 11352 1726776699.05631: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/AnsiballZ_command.py && sleep 0' 11352 1726776699.21326: stdout chunk (state=2): >>> {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[vm]\ntransparent_hugepages = never", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 16:11:39.208688", "end": "2024-09-19 16:11:39.211547", "delta": "0:00:00.002859", "msg": "", 
"invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11352 1726776699.22497: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11352 1726776699.22546: stderr chunk (state=3): >>><<< 11352 1726776699.22553: stdout chunk (state=3): >>><<< 11352 1726776699.22573: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[vm]\ntransparent_hugepages = never", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 16:11:39.208688", "end": "2024-09-19 16:11:39.211547", "delta": "0:00:00.002859", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11352 1726776699.22607: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11352 1726776699.22617: _low_level_execute_command(): starting 11352 1726776699.22624: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776698.9687808-11352-87704570444105/ > /dev/null 2>&1 && sleep 0' 11352 1726776699.25043: stderr chunk (state=2): >>><<< 11352 1726776699.25051: stdout chunk (state=2): >>><<< 11352 1726776699.25068: _low_level_execute_command() done: rc=0, stdout=, stderr= 11352 1726776699.25076: handler run complete 11352 1726776699.25094: Evaluated conditional (False): False 11352 1726776699.25104: attempt loop complete, returning result 11352 1726776699.25108: _execute() done 11352 1726776699.25111: dumping result to json 11352 1726776699.25117: done dumping result, returning 11352 1726776699.25123: done running TaskExecutor() for managed_node2/TASK: Show current tuned profile settings [120fa90a-8a95-cec2-986e-000000000cab] 11352 1726776699.25133: sending task result for task 120fa90a-8a95-cec2-986e-000000000cab 11352 1726776699.25168: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cab 11352 1726776699.25172: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/tuned/kernel_settings/tuned.conf" ], "delta": 
"0:00:00.002859", "end": "2024-09-19 16:11:39.211547", "rc": 0, "start": "2024-09-19 16:11:39.208688" } STDOUT: # # Ansible managed # # system_role:kernel_settings [main] summary = kernel settings [vm] transparent_hugepages = never 8218 1726776699.25308: no more pending results, returning what we have 8218 1726776699.25311: results queue empty 8218 1726776699.25312: checking for any_errors_fatal 8218 1726776699.25314: done checking for any_errors_fatal 8218 1726776699.25314: checking for max_fail_percentage 8218 1726776699.25316: done checking for max_fail_percentage 8218 1726776699.25316: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.25317: done checking to see if all hosts have failed 8218 1726776699.25318: getting the remaining hosts for this loop 8218 1726776699.25319: done getting the remaining hosts for this loop 8218 1726776699.25322: getting the next task for host managed_node2 8218 1726776699.25332: done getting next task for host managed_node2 8218 1726776699.25334: ^ task is: TASK: Run role with purge to remove everything 8218 1726776699.25337: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.25340: getting variables 8218 1726776699.25341: in VariableManager get_vars() 8218 1726776699.25375: Calling all_inventory to load vars for managed_node2 8218 1726776699.25378: Calling groups_inventory to load vars for managed_node2 8218 1726776699.25379: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.25388: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.25391: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.25393: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.25517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.25635: done with get_vars() 8218 1726776699.25644: done getting variables TASK [Run role with purge to remove everything] ******************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.330) 0:01:25.087 **** 8218 1726776699.25708: entering _queue_task() for managed_node2/include_role 8218 1726776699.25878: worker is 1 (out of 1 available) 8218 1726776699.25892: exiting _queue_task() for managed_node2/include_role 8218 1726776699.25904: done queuing things up, now waiting for results queue to drain 8218 1726776699.25905: waiting for pending results... 
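[Editor's note] The "Show current tuned profile settings" result above dumps the ini-style profile the role rendered to `/etc/tuned/kernel_settings/tuned.conf`. As a sketch (the profile text is copied from the log's STDOUT; the parsing code is illustrative, not part of the role), Python's `configparser` can read the same structure back:

```python
# Sketch: parse the tuned profile shown in the task output above.
# PROFILE is copied from the log; `#` comment lines are ignored by default.
import configparser

PROFILE = """\
#
# Ansible managed
#
# system_role:kernel_settings

[main]
summary = kernel settings

[vm]
transparent_hugepages = never
"""

parser = configparser.ConfigParser()
parser.read_string(PROFILE)
print(parser["main"]["summary"])              # kernel settings
print(parser["vm"]["transparent_hugepages"])  # never
```

The subsequent "Run role with purge to remove everything" task is expected to delete this profile as part of cleanup.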
11361 1726776699.26037: running TaskExecutor() for managed_node2/TASK: Run role with purge to remove everything 11361 1726776699.26155: in run() - task 120fa90a-8a95-cec2-986e-000000000cad 11361 1726776699.26173: variable 'ansible_search_path' from source: unknown 11361 1726776699.26177: variable 'ansible_search_path' from source: unknown 11361 1726776699.26205: calling self._execute() 11361 1726776699.26279: variable 'ansible_host' from source: host vars for 'managed_node2' 11361 1726776699.26290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11361 1726776699.26298: variable 'omit' from source: magic vars 11361 1726776699.26379: _execute() done 11361 1726776699.26385: dumping result to json 11361 1726776699.26389: done dumping result, returning 11361 1726776699.26396: done running TaskExecutor() for managed_node2/TASK: Run role with purge to remove everything [120fa90a-8a95-cec2-986e-000000000cad] 11361 1726776699.26404: sending task result for task 120fa90a-8a95-cec2-986e-000000000cad 11361 1726776699.26435: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cad 11361 1726776699.26439: WORKER PROCESS EXITING 8218 1726776699.26544: no more pending results, returning what we have 8218 1726776699.26548: in VariableManager get_vars() 8218 1726776699.26583: Calling all_inventory to load vars for managed_node2 8218 1726776699.26586: Calling groups_inventory to load vars for managed_node2 8218 1726776699.26588: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.26596: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.26598: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.26601: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.26748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.26854: done with get_vars() 8218 1726776699.26860: variable 
'ansible_search_path' from source: unknown 8218 1726776699.26860: variable 'ansible_search_path' from source: unknown 8218 1726776699.27048: variable 'omit' from source: magic vars 8218 1726776699.27071: variable 'omit' from source: magic vars 8218 1726776699.27080: variable 'omit' from source: magic vars 8218 1726776699.27082: we have included files to process 8218 1726776699.27083: generating all_blocks data 8218 1726776699.27084: done generating all_blocks data 8218 1726776699.27086: processing included file: fedora.linux_system_roles.kernel_settings 8218 1726776699.27099: in VariableManager get_vars() 8218 1726776699.27110: done with get_vars() 8218 1726776699.27131: in VariableManager get_vars() 8218 1726776699.27143: done with get_vars() 8218 1726776699.27170: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8218 1726776699.27209: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8218 1726776699.27225: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8218 1726776699.27274: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8218 1726776699.27600: in VariableManager get_vars() 8218 1726776699.27614: done with get_vars() 8218 1726776699.28423: in VariableManager get_vars() 8218 1726776699.28438: done with get_vars() 8218 1726776699.28540: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8218 1726776699.28920: iterating over new_blocks loaded from include file 8218 1726776699.28921: in VariableManager get_vars() 8218 1726776699.28935: done with get_vars() 8218 1726776699.28937: filtering new block on tags 8218 1726776699.28977: done filtering new block on tags 8218 1726776699.28979: in VariableManager 
get_vars() 8218 1726776699.28989: done with get_vars() 8218 1726776699.28990: filtering new block on tags 8218 1726776699.29017: done filtering new block on tags 8218 1726776699.29018: in VariableManager get_vars() 8218 1726776699.29031: done with get_vars() 8218 1726776699.29032: filtering new block on tags 8218 1726776699.29136: done filtering new block on tags 8218 1726776699.29139: in VariableManager get_vars() 8218 1726776699.29150: done with get_vars() 8218 1726776699.29151: filtering new block on tags 8218 1726776699.29162: done filtering new block on tags 8218 1726776699.29163: done iterating over new_blocks loaded from include file 8218 1726776699.29163: extending task lists for all hosts with included blocks 8218 1726776699.29345: done extending task lists 8218 1726776699.29346: done processing included files 8218 1726776699.29347: results queue empty 8218 1726776699.29347: checking for any_errors_fatal 8218 1726776699.29350: done checking for any_errors_fatal 8218 1726776699.29351: checking for max_fail_percentage 8218 1726776699.29351: done checking for max_fail_percentage 8218 1726776699.29352: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.29352: done checking to see if all hosts have failed 8218 1726776699.29353: getting the remaining hosts for this loop 8218 1726776699.29353: done getting the remaining hosts for this loop 8218 1726776699.29355: getting the next task for host managed_node2 8218 1726776699.29357: done getting next task for host managed_node2 8218 1726776699.29359: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8218 1726776699.29360: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776699.29369: getting variables 8218 1726776699.29369: in VariableManager get_vars() 8218 1726776699.29378: Calling all_inventory to load vars for managed_node2 8218 1726776699.29379: Calling groups_inventory to load vars for managed_node2 8218 1726776699.29380: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.29384: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.29385: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.29387: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.29466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.29572: done with get_vars() 8218 1726776699.29579: done getting variables 8218 1726776699.29602: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 16:11:39 -0400 
(0:00:00.039) 0:01:25.126 **** 8218 1726776699.29627: entering _queue_task() for managed_node2/fail 8218 1726776699.29792: worker is 1 (out of 1 available) 8218 1726776699.29806: exiting _queue_task() for managed_node2/fail 8218 1726776699.29817: done queuing things up, now waiting for results queue to drain 8218 1726776699.29819: waiting for pending results... 11362 1726776699.29945: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 11362 1726776699.30061: in run() - task 120fa90a-8a95-cec2-986e-000000000ed1 11362 1726776699.30077: variable 'ansible_search_path' from source: unknown 11362 1726776699.30081: variable 'ansible_search_path' from source: unknown 11362 1726776699.30108: calling self._execute() 11362 1726776699.30177: variable 'ansible_host' from source: host vars for 'managed_node2' 11362 1726776699.30186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11362 1726776699.30195: variable 'omit' from source: magic vars 11362 1726776699.30533: variable 'kernel_settings_sysctl' from source: include params 11362 1726776699.30543: variable '__kernel_settings_state_empty' from source: role '' all vars 11362 1726776699.30554: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 11362 1726776699.30747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11362 1726776699.32409: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11362 1726776699.32456: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11362 1726776699.32495: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11362 1726776699.32521: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11362 1726776699.32545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11362 1726776699.32596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11362 1726776699.32617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11362 1726776699.32638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11362 1726776699.32666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11362 1726776699.32678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11362 1726776699.32714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11362 1726776699.32733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11362 1726776699.32750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 11362 1726776699.32778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11362 1726776699.32789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11362 1726776699.32816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11362 1726776699.32835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11362 1726776699.32852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11362 1726776699.32879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11362 1726776699.32890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11362 1726776699.33066: variable 'kernel_settings_sysctl' from source: include params 11362 1726776699.33087: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 11362 1726776699.33092: when 
evaluation is False, skipping this task 11362 1726776699.33096: _execute() done 11362 1726776699.33099: dumping result to json 11362 1726776699.33103: done dumping result, returning 11362 1726776699.33109: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-cec2-986e-000000000ed1] 11362 1726776699.33115: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed1 11362 1726776699.33138: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed1 11362 1726776699.33141: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8218 1726776699.33246: no more pending results, returning what we have 8218 1726776699.33250: results queue empty 8218 1726776699.33251: checking for any_errors_fatal 8218 1726776699.33253: done checking for any_errors_fatal 8218 1726776699.33254: checking for max_fail_percentage 8218 1726776699.33255: done checking for max_fail_percentage 8218 1726776699.33256: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.33256: done checking to see if all hosts have failed 8218 1726776699.33257: getting the remaining hosts for this loop 8218 1726776699.33258: done getting the remaining hosts for this loop 8218 1726776699.33261: getting the next task for host managed_node2 8218 1726776699.33269: done getting next task for host managed_node2 8218 1726776699.33273: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8218 1726776699.33276: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776699.33296: getting variables 8218 1726776699.33297: in VariableManager get_vars() 8218 1726776699.33326: Calling all_inventory to load vars for managed_node2 8218 1726776699.33331: Calling groups_inventory to load vars for managed_node2 8218 1726776699.33333: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.33341: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.33343: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.33346: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.33463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.33747: done with get_vars() 8218 1726776699.33754: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.041) 0:01:25.168 **** 8218 1726776699.33818: entering _queue_task() for managed_node2/include_tasks 8218 1726776699.33977: worker is 1 (out of 1 available) 8218 
1726776699.33991: exiting _queue_task() for managed_node2/include_tasks 8218 1726776699.34002: done queuing things up, now waiting for results queue to drain 8218 1726776699.34004: waiting for pending results... 11363 1726776699.34123: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 11363 1726776699.34253: in run() - task 120fa90a-8a95-cec2-986e-000000000ed2 11363 1726776699.34268: variable 'ansible_search_path' from source: unknown 11363 1726776699.34272: variable 'ansible_search_path' from source: unknown 11363 1726776699.34300: calling self._execute() 11363 1726776699.34365: variable 'ansible_host' from source: host vars for 'managed_node2' 11363 1726776699.34374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11363 1726776699.34382: variable 'omit' from source: magic vars 11363 1726776699.34457: _execute() done 11363 1726776699.34463: dumping result to json 11363 1726776699.34467: done dumping result, returning 11363 1726776699.34473: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-cec2-986e-000000000ed2] 11363 1726776699.34480: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed2 11363 1726776699.34504: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed2 11363 1726776699.34508: WORKER PROCESS EXITING 8218 1726776699.34608: no more pending results, returning what we have 8218 1726776699.34612: in VariableManager get_vars() 8218 1726776699.34649: Calling all_inventory to load vars for managed_node2 8218 1726776699.34651: Calling groups_inventory to load vars for managed_node2 8218 1726776699.34653: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.34661: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.34663: Calling groups_plugins_inventory to load vars for managed_node2 8218 
1726776699.34668: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.34778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.34893: done with get_vars() 8218 1726776699.34898: variable 'ansible_search_path' from source: unknown 8218 1726776699.34899: variable 'ansible_search_path' from source: unknown 8218 1726776699.34921: we have included files to process 8218 1726776699.34922: generating all_blocks data 8218 1726776699.34923: done generating all_blocks data 8218 1726776699.34927: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776699.34930: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8218 1726776699.34932: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8218 1726776699.35372: done processing included file 8218 1726776699.35375: iterating over new_blocks loaded from include file 8218 1726776699.35375: in VariableManager get_vars() 8218 1726776699.35391: done with get_vars() 8218 1726776699.35392: filtering new block on tags 8218 1726776699.35410: done filtering new block on tags 8218 1726776699.35434: in VariableManager get_vars() 8218 1726776699.35450: done with get_vars() 8218 1726776699.35451: filtering new block on tags 8218 1726776699.35478: done filtering new block on tags 8218 1726776699.35479: in VariableManager get_vars() 8218 1726776699.35493: done with get_vars() 8218 1726776699.35494: filtering new block on tags 8218 1726776699.35517: done filtering new block on tags 8218 1726776699.35518: in VariableManager get_vars() 8218 1726776699.35535: done with get_vars() 8218 
1726776699.35536: filtering new block on tags 8218 1726776699.35551: done filtering new block on tags 8218 1726776699.35552: done iterating over new_blocks loaded from include file 8218 1726776699.35553: extending task lists for all hosts with included blocks 8218 1726776699.35684: done extending task lists 8218 1726776699.35685: done processing included files 8218 1726776699.35685: results queue empty 8218 1726776699.35685: checking for any_errors_fatal 8218 1726776699.35688: done checking for any_errors_fatal 8218 1726776699.35689: checking for max_fail_percentage 8218 1726776699.35689: done checking for max_fail_percentage 8218 1726776699.35690: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.35690: done checking to see if all hosts have failed 8218 1726776699.35691: getting the remaining hosts for this loop 8218 1726776699.35691: done getting the remaining hosts for this loop 8218 1726776699.35693: getting the next task for host managed_node2 8218 1726776699.35695: done getting next task for host managed_node2 8218 1726776699.35697: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8218 1726776699.35700: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776699.35706: getting variables 8218 1726776699.35707: in VariableManager get_vars() 8218 1726776699.35715: Calling all_inventory to load vars for managed_node2 8218 1726776699.35716: Calling groups_inventory to load vars for managed_node2 8218 1726776699.35718: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.35721: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.35722: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.35723: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.35800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.35908: done with get_vars() 8218 1726776699.35914: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.021) 0:01:25.190 **** 8218 1726776699.35963: entering _queue_task() for managed_node2/setup 8218 1726776699.36119: worker is 1 (out of 1 available) 8218 1726776699.36134: exiting _queue_task() for managed_node2/setup 8218 1726776699.36146: done queuing things up, now waiting for results queue to drain 8218 1726776699.36147: waiting for pending results... 
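An aside on the skip recorded above for "Check sysctl settings for boolean values": the task's `when` condition uses Jinja2's `sameas` test, which compares by identity, so it matches only raw booleans (`true`/`false`), not `1`, `0`, or the string `"true"`. A minimal Python model of that filter chain, with hypothetical settings data (the real `kernel_settings_sysctl` comes from include params and is not shown in this log):

```python
def has_boolean_values(settings):
    """Model of the role's check:
    (list | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0)
    or the same with false.  Returns True if any entry carries a raw boolean value."""
    defined = [s for s in settings if "value" in s]        # selectattr("value", "defined")
    trues = [s for s in defined if s["value"] is True]     # sameas true -> identity, not equality
    falses = [s for s in defined if s["value"] is False]   # sameas false
    return len(trues) > 0 or len(falses) > 0

# Identity semantics matter: 1 == True in Python, but 1 is not True.
print(has_boolean_values([{"name": "vm.swappiness", "value": 1}]))   # -> False (int, not bool)
print(has_boolean_values([{"name": "fs.file-max", "value": True}]))  # -> True (raw boolean)
```

In the run above, no entry had a raw boolean value, so the condition evaluated False and the `fail` task was skipped.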
11364 1726776699.36270: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 11364 1726776699.36401: in run() - task 120fa90a-8a95-cec2-986e-000000000f4d 11364 1726776699.36417: variable 'ansible_search_path' from source: unknown 11364 1726776699.36420: variable 'ansible_search_path' from source: unknown 11364 1726776699.36448: calling self._execute() 11364 1726776699.36569: variable 'ansible_host' from source: host vars for 'managed_node2' 11364 1726776699.36577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11364 1726776699.36586: variable 'omit' from source: magic vars 11364 1726776699.36934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11364 1726776699.38461: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11364 1726776699.38514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11364 1726776699.38548: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11364 1726776699.38576: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11364 1726776699.38597: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11364 1726776699.38653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11364 1726776699.38675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11364 1726776699.38693: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11364 1726776699.38719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11364 1726776699.38732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11364 1726776699.38770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11364 1726776699.38788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11364 1726776699.38805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11364 1726776699.38832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11364 1726776699.38843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11364 1726776699.38959: variable '__kernel_settings_required_facts' from source: role '' all vars 11364 1726776699.38970: variable 'ansible_facts' from source: unknown 11364 1726776699.39026: Evaluated conditional 
(__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11364 1726776699.39034: when evaluation is False, skipping this task 11364 1726776699.39038: _execute() done 11364 1726776699.39041: dumping result to json 11364 1726776699.39044: done dumping result, returning 11364 1726776699.39051: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-cec2-986e-000000000f4d] 11364 1726776699.39057: sending task result for task 120fa90a-8a95-cec2-986e-000000000f4d 11364 1726776699.39079: done sending task result for task 120fa90a-8a95-cec2-986e-000000000f4d 11364 1726776699.39083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8218 1726776699.39256: no more pending results, returning what we have 8218 1726776699.39259: results queue empty 8218 1726776699.39260: checking for any_errors_fatal 8218 1726776699.39261: done checking for any_errors_fatal 8218 1726776699.39262: checking for max_fail_percentage 8218 1726776699.39263: done checking for max_fail_percentage 8218 1726776699.39263: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.39264: done checking to see if all hosts have failed 8218 1726776699.39267: getting the remaining hosts for this loop 8218 1726776699.39268: done getting the remaining hosts for this loop 8218 1726776699.39271: getting the next task for host managed_node2 8218 1726776699.39279: done getting next task for host managed_node2 8218 1726776699.39282: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8218 1726776699.39286: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.39301: getting variables 8218 1726776699.39302: in VariableManager get_vars() 8218 1726776699.39326: Calling all_inventory to load vars for managed_node2 8218 1726776699.39330: Calling groups_inventory to load vars for managed_node2 8218 1726776699.39332: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.39338: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.39340: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.39341: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.39444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.39566: done with get_vars() 8218 1726776699.39574: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.036) 0:01:25.226 **** 8218 1726776699.39642: entering _queue_task() for managed_node2/stat 8218 1726776699.39794: worker is 1 (out of 1 available) 8218 1726776699.39808: exiting _queue_task() for managed_node2/stat 8218 1726776699.39819: done queuing things up, now waiting for results queue to drain 8218 1726776699.39821: waiting for pending results... 
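The "Ensure ansible_facts used by role" setup task above was skipped because its condition `__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0` came out False: every fact the role needs was already gathered. Ansible's `difference` filter is a set difference; a sketch with illustrative fact names (the role's actual required-facts list is not shown in this log):

```python
def missing_facts(required, gathered):
    """Facts the role still needs, mirroring
    required | difference(ansible_facts.keys() | list)."""
    return set(required) - set(gathered)

required = ["distribution", "distribution_major_version"]  # hypothetical list
facts = {"distribution": "CentOS", "distribution_major_version": "8",
         "os_family": "RedHat"}

# Empty result -> (length > 0) is False -> the setup task is skipped.
print(sorted(missing_facts(required, facts)))  # -> []
```

This is the standard pattern for making a role's internal fact-gathering idempotent: re-run `setup` only for the subset of facts that is actually absent.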
11365 1726776699.39945: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 11365 1726776699.40074: in run() - task 120fa90a-8a95-cec2-986e-000000000f4f 11365 1726776699.40089: variable 'ansible_search_path' from source: unknown 11365 1726776699.40093: variable 'ansible_search_path' from source: unknown 11365 1726776699.40119: calling self._execute() 11365 1726776699.40189: variable 'ansible_host' from source: host vars for 'managed_node2' 11365 1726776699.40198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11365 1726776699.40207: variable 'omit' from source: magic vars 11365 1726776699.40521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11365 1726776699.40694: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11365 1726776699.40753: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11365 1726776699.40781: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11365 1726776699.40810: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11365 1726776699.40868: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11365 1726776699.40888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11365 1726776699.40906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11365 1726776699.40924: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11365 1726776699.41012: variable '__kernel_settings_is_ostree' from source: set_fact 11365 1726776699.41025: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11365 1726776699.41033: when evaluation is False, skipping this task 11365 1726776699.41037: _execute() done 11365 1726776699.41040: dumping result to json 11365 1726776699.41043: done dumping result, returning 11365 1726776699.41048: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-cec2-986e-000000000f4f] 11365 1726776699.41054: sending task result for task 120fa90a-8a95-cec2-986e-000000000f4f 11365 1726776699.41079: done sending task result for task 120fa90a-8a95-cec2-986e-000000000f4f 11365 1726776699.41083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8218 1726776699.41181: no more pending results, returning what we have 8218 1726776699.41184: results queue empty 8218 1726776699.41185: checking for any_errors_fatal 8218 1726776699.41191: done checking for any_errors_fatal 8218 1726776699.41192: checking for max_fail_percentage 8218 1726776699.41193: done checking for max_fail_percentage 8218 1726776699.41194: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.41194: done checking to see if all hosts have failed 8218 1726776699.41195: getting the remaining hosts for this loop 8218 1726776699.41196: done getting the remaining hosts for this loop 8218 1726776699.41199: getting the next task for host managed_node2 8218 1726776699.41205: done getting next task for host managed_node2 8218 1726776699.41208: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8218 1726776699.41212: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.41227: getting variables 8218 1726776699.41230: in VariableManager get_vars() 8218 1726776699.41260: Calling all_inventory to load vars for managed_node2 8218 1726776699.41262: Calling groups_inventory to load vars for managed_node2 8218 1726776699.41264: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.41271: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.41274: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.41276: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.41379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.41518: done with get_vars() 8218 1726776699.41525: done getting variables 8218 1726776699.41566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.019) 0:01:25.246 **** 8218 1726776699.41589: entering _queue_task() for managed_node2/set_fact 8218 1726776699.41734: worker is 1 (out of 1 available) 8218 1726776699.41744: exiting _queue_task() for managed_node2/set_fact 8218 1726776699.41756: done queuing things up, now waiting for results queue to drain 8218 1726776699.41758: waiting for pending results... 
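The "Check if system is ostree" stat task was skipped for a similar reason: its condition `not __kernel_settings_is_ostree is defined` was False because a `set_fact` from an earlier pass through set_vars.yml had already cached the flag. A sketch of this probe-once pattern, with host variable storage simplified to a plain dict:

```python
def should_probe_ostree(host_vars):
    """Run the stat probe only when the cached flag is absent,
    mirroring the Jinja2 condition: not __kernel_settings_is_ostree is defined."""
    return "__kernel_settings_is_ostree" not in host_vars

host_vars = {}
print(should_probe_ostree(host_vars))                     # -> True: first pass probes
host_vars["__kernel_settings_is_ostree"] = False          # set_fact caches the result
print(should_probe_ostree(host_vars))                     # -> False: later passes skip the stat
```

The subsequent "Set flag to indicate system is ostree" `set_fact` task is guarded the same way, which is why both tasks show up as skipped on every invocation of the role after the first.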
11366 1726776699.41883: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 11366 1726776699.42005: in run() - task 120fa90a-8a95-cec2-986e-000000000f50 11366 1726776699.42020: variable 'ansible_search_path' from source: unknown 11366 1726776699.42024: variable 'ansible_search_path' from source: unknown 11366 1726776699.42050: calling self._execute() 11366 1726776699.42118: variable 'ansible_host' from source: host vars for 'managed_node2' 11366 1726776699.42126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11366 1726776699.42137: variable 'omit' from source: magic vars 11366 1726776699.42451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11366 1726776699.42619: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11366 1726776699.42654: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11366 1726776699.42683: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11366 1726776699.42709: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11366 1726776699.42767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11366 1726776699.42787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11366 1726776699.42805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11366 1726776699.42822: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11366 1726776699.42911: variable '__kernel_settings_is_ostree' from source: set_fact 11366 1726776699.42922: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11366 1726776699.42926: when evaluation is False, skipping this task 11366 1726776699.42933: _execute() done 11366 1726776699.42936: dumping result to json 11366 1726776699.42940: done dumping result, returning 11366 1726776699.42946: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-cec2-986e-000000000f50] 11366 1726776699.42952: sending task result for task 120fa90a-8a95-cec2-986e-000000000f50 11366 1726776699.42976: done sending task result for task 120fa90a-8a95-cec2-986e-000000000f50 11366 1726776699.42980: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8218 1726776699.43099: no more pending results, returning what we have 8218 1726776699.43102: results queue empty 8218 1726776699.43103: checking for any_errors_fatal 8218 1726776699.43107: done checking for any_errors_fatal 8218 1726776699.43108: checking for max_fail_percentage 8218 1726776699.43109: done checking for max_fail_percentage 8218 1726776699.43110: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.43111: done checking to see if all hosts have failed 8218 1726776699.43111: getting the remaining hosts for this loop 8218 1726776699.43112: done getting the remaining hosts for this loop 8218 1726776699.43115: getting the next task for host managed_node2 8218 1726776699.43123: done getting next task for host managed_node2 8218 1726776699.43128: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8218 1726776699.43134: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.43147: getting variables 8218 1726776699.43148: in VariableManager get_vars() 8218 1726776699.43172: Calling all_inventory to load vars for managed_node2 8218 1726776699.43174: Calling groups_inventory to load vars for managed_node2 8218 1726776699.43175: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.43181: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.43182: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.43184: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.43285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.43401: done with get_vars() 8218 1726776699.43409: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.018) 0:01:25.265 **** 8218 1726776699.43473: entering _queue_task() for managed_node2/stat 8218 1726776699.43613: worker is 1 (out of 1 available) 8218 1726776699.43625: exiting _queue_task() for managed_node2/stat 8218 1726776699.43637: done queuing things up, now waiting for results queue to drain 8218 1726776699.43639: waiting for pending results... 
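The task queued below (set_vars.yml:22) is a `stat` action, per the `entering _queue_task() for managed_node2/stat` line above. A hedged sketch, where the task name and conditional come from the log and the `stat` arguments and register name are assumed:

```yaml
# Hypothetical sketch of roles/kernel_settings/tasks/set_vars.yml:22.
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __kernel_settings_transactional_update_stat  # assumed register name
  when: not __kernel_settings_is_transactional is defined
```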
11367 1726776699.43760: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 11367 1726776699.43891: in run() - task 120fa90a-8a95-cec2-986e-000000000f52 11367 1726776699.43906: variable 'ansible_search_path' from source: unknown 11367 1726776699.43910: variable 'ansible_search_path' from source: unknown 11367 1726776699.43938: calling self._execute() 11367 1726776699.44004: variable 'ansible_host' from source: host vars for 'managed_node2' 11367 1726776699.44012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11367 1726776699.44020: variable 'omit' from source: magic vars 11367 1726776699.44339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11367 1726776699.44568: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11367 1726776699.44600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11367 1726776699.44627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11367 1726776699.44654: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11367 1726776699.44713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11367 1726776699.44737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11367 1726776699.44755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11367 1726776699.44776: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11367 1726776699.44859: variable '__kernel_settings_is_transactional' from source: set_fact 11367 1726776699.44873: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11367 1726776699.44877: when evaluation is False, skipping this task 11367 1726776699.44881: _execute() done 11367 1726776699.44884: dumping result to json 11367 1726776699.44888: done dumping result, returning 11367 1726776699.44894: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-cec2-986e-000000000f52] 11367 1726776699.44900: sending task result for task 120fa90a-8a95-cec2-986e-000000000f52 11367 1726776699.44921: done sending task result for task 120fa90a-8a95-cec2-986e-000000000f52 11367 1726776699.44924: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8218 1726776699.45022: no more pending results, returning what we have 8218 1726776699.45025: results queue empty 8218 1726776699.45026: checking for any_errors_fatal 8218 1726776699.45034: done checking for any_errors_fatal 8218 1726776699.45035: checking for max_fail_percentage 8218 1726776699.45036: done checking for max_fail_percentage 8218 1726776699.45037: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.45037: done checking to see if all hosts have failed 8218 1726776699.45038: getting the remaining hosts for this loop 8218 1726776699.45039: done getting the remaining hosts for this loop 8218 1726776699.45042: getting the next task for host managed_node2 8218 1726776699.45049: done getting next task for host managed_node2 8218 
1726776699.45052: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8218 1726776699.45056: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.45071: getting variables 8218 1726776699.45073: in VariableManager get_vars() 8218 1726776699.45102: Calling all_inventory to load vars for managed_node2 8218 1726776699.45105: Calling groups_inventory to load vars for managed_node2 8218 1726776699.45107: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.45114: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.45116: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.45119: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.45264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.45379: done with get_vars() 8218 1726776699.45386: done getting variables 8218 1726776699.45424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.019) 0:01:25.285 **** 8218 1726776699.45452: entering _queue_task() for managed_node2/set_fact 8218 1726776699.45596: worker is 1 (out of 1 available) 8218 1726776699.45609: exiting _queue_task() for managed_node2/set_fact 8218 1726776699.45622: done queuing things up, now waiting for results queue to drain 8218 1726776699.45623: waiting for pending results... 
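The companion `set_fact` task queued below (set_vars.yml:27) presumably turns the stat result into the `__kernel_settings_is_transactional` flag. A sketch under that assumption; the stat register name is hypothetical:

```yaml
# Hypothetical sketch of roles/kernel_settings/tasks/set_vars.yml:27.
- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __kernel_settings_transactional_update_stat.stat.exists | d(false) }}"
  when: not __kernel_settings_is_transactional is defined
```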
11368 1726776699.45752: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 11368 1726776699.45883: in run() - task 120fa90a-8a95-cec2-986e-000000000f53 11368 1726776699.45898: variable 'ansible_search_path' from source: unknown 11368 1726776699.45902: variable 'ansible_search_path' from source: unknown 11368 1726776699.45927: calling self._execute() 11368 1726776699.45998: variable 'ansible_host' from source: host vars for 'managed_node2' 11368 1726776699.46005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11368 1726776699.46015: variable 'omit' from source: magic vars 11368 1726776699.46337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11368 1726776699.46516: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11368 1726776699.46551: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11368 1726776699.46579: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11368 1726776699.46608: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11368 1726776699.46672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11368 1726776699.46692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11368 1726776699.46710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11368 1726776699.46730: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11368 1726776699.46820: variable '__kernel_settings_is_transactional' from source: set_fact 11368 1726776699.46834: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11368 1726776699.46838: when evaluation is False, skipping this task 11368 1726776699.46842: _execute() done 11368 1726776699.46846: dumping result to json 11368 1726776699.46850: done dumping result, returning 11368 1726776699.46856: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-cec2-986e-000000000f53] 11368 1726776699.46863: sending task result for task 120fa90a-8a95-cec2-986e-000000000f53 11368 1726776699.46886: done sending task result for task 120fa90a-8a95-cec2-986e-000000000f53 11368 1726776699.46890: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8218 1726776699.46984: no more pending results, returning what we have 8218 1726776699.46987: results queue empty 8218 1726776699.46987: checking for any_errors_fatal 8218 1726776699.46992: done checking for any_errors_fatal 8218 1726776699.46992: checking for max_fail_percentage 8218 1726776699.46994: done checking for max_fail_percentage 8218 1726776699.46994: checking to see if all hosts have failed and the running result is not ok 8218 1726776699.46995: done checking to see if all hosts have failed 8218 1726776699.46995: getting the remaining hosts for this loop 8218 1726776699.46997: done getting the remaining hosts for this loop 8218 1726776699.46999: getting the next task for host managed_node2 8218 1726776699.47007: done getting next task for host managed_node2 8218 1726776699.47010: ^ 
task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8218 1726776699.47014: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.47032: getting variables 8218 1726776699.47033: in VariableManager get_vars() 8218 1726776699.47063: Calling all_inventory to load vars for managed_node2 8218 1726776699.47065: Calling groups_inventory to load vars for managed_node2 8218 1726776699.47067: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.47075: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.47077: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.47079: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.47183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.47304: done with get_vars() 8218 1726776699.47311: done getting variables 8218 1726776699.47350: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.019) 0:01:25.304 **** 8218 1726776699.47374: entering _queue_task() for managed_node2/include_vars 8218 1726776699.47517: worker is 1 (out of 1 available) 8218 1726776699.47533: exiting _queue_task() for managed_node2/include_vars 8218 1726776699.47545: done queuing things up, now waiting for results queue to drain 8218 1726776699.47546: waiting for pending results... 
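The `include_vars` task run below (set_vars.yml:31) uses the `first_found` lookup: the log shows the `ffparams` task var, four `ansible_facts` lookups, `role_path`, and that `vars/default.yml` was ultimately included. A hedged sketch of that pattern; the exact candidate file list is an assumption:

```yaml
# Hedged sketch of roles/kernel_settings/tasks/set_vars.yml:31.
# The first_found lookup, ffparams var, and the default.yml fallback match
# the log above; the search list below is assumed.
- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        - "{{ ansible_facts['distribution'] }}-{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}-{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"
```

In this run none of the distribution-specific files matched, so the lookup fell through to `default.yml`, which set `__kernel_settings_packages` to `["tuned", "python3-configobj"]` and `__kernel_settings_services` to `["tuned"]` as shown in the `ok:` result.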
11369 1726776699.47669: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 11369 1726776699.47789: in run() - task 120fa90a-8a95-cec2-986e-000000000f55 11369 1726776699.47806: variable 'ansible_search_path' from source: unknown 11369 1726776699.47810: variable 'ansible_search_path' from source: unknown 11369 1726776699.47838: calling self._execute() 11369 1726776699.47905: variable 'ansible_host' from source: host vars for 'managed_node2' 11369 1726776699.47914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11369 1726776699.47922: variable 'omit' from source: magic vars 11369 1726776699.47997: variable 'omit' from source: magic vars 11369 1726776699.48048: variable 'omit' from source: magic vars 11369 1726776699.48304: variable 'ffparams' from source: task vars 11369 1726776699.48451: variable 'ansible_facts' from source: unknown 11369 1726776699.48572: variable 'ansible_facts' from source: unknown 11369 1726776699.48658: variable 'ansible_facts' from source: unknown 11369 1726776699.48743: variable 'ansible_facts' from source: unknown 11369 1726776699.48817: variable 'role_path' from source: magic vars 11369 1726776699.48931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11369 1726776699.49071: Loaded config def from plugin (lookup/first_found) 11369 1726776699.49079: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 11369 1726776699.49105: variable 'ansible_search_path' from source: unknown 11369 1726776699.49123: variable 'ansible_search_path' from source: unknown 11369 1726776699.49134: variable 'ansible_search_path' from source: unknown 11369 1726776699.49142: variable 'ansible_search_path' from source: unknown 11369 1726776699.49149: variable 'ansible_search_path' from source: unknown 11369 1726776699.49162: variable 'omit' from source: magic vars 11369 
1726776699.49181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11369 1726776699.49199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11369 1726776699.49215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11369 1726776699.49228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11369 1726776699.49239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11369 1726776699.49259: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11369 1726776699.49263: variable 'ansible_host' from source: host vars for 'managed_node2' 11369 1726776699.49267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11369 1726776699.49325: Set connection var ansible_connection to ssh 11369 1726776699.49335: Set connection var ansible_pipelining to False 11369 1726776699.49341: Set connection var ansible_timeout to 10 11369 1726776699.49348: Set connection var ansible_module_compression to ZIP_DEFLATED 11369 1726776699.49353: Set connection var ansible_shell_type to sh 11369 1726776699.49359: Set connection var ansible_shell_executable to /bin/sh 11369 1726776699.49374: variable 'ansible_shell_executable' from source: unknown 11369 1726776699.49378: variable 'ansible_connection' from source: unknown 11369 1726776699.49381: variable 'ansible_module_compression' from source: unknown 11369 1726776699.49384: variable 'ansible_shell_type' from source: unknown 11369 1726776699.49388: variable 'ansible_shell_executable' from source: unknown 11369 1726776699.49391: variable 'ansible_host' from source: host vars for 'managed_node2' 11369 1726776699.49395: variable 'ansible_pipelining' from source: unknown 11369 1726776699.49398: variable 
'ansible_timeout' from source: unknown 11369 1726776699.49400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11369 1726776699.49472: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11369 1726776699.49484: variable 'omit' from source: magic vars 11369 1726776699.49489: starting attempt loop 11369 1726776699.49493: running the handler 11369 1726776699.49535: handler run complete 11369 1726776699.49545: attempt loop complete, returning result 11369 1726776699.49548: _execute() done 11369 1726776699.49551: dumping result to json 11369 1726776699.49556: done dumping result, returning 11369 1726776699.49561: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-cec2-986e-000000000f55] 11369 1726776699.49568: sending task result for task 120fa90a-8a95-cec2-986e-000000000f55 11369 1726776699.49592: done sending task result for task 120fa90a-8a95-cec2-986e-000000000f55 11369 1726776699.49596: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8218 1726776699.49730: no more pending results, returning what we have 8218 1726776699.49733: results queue empty 8218 1726776699.49734: checking for any_errors_fatal 8218 1726776699.49739: done checking for any_errors_fatal 8218 1726776699.49739: checking for max_fail_percentage 8218 1726776699.49741: done checking for max_fail_percentage 8218 1726776699.49741: 
checking to see if all hosts have failed and the running result is not ok 8218 1726776699.49742: done checking to see if all hosts have failed 8218 1726776699.49742: getting the remaining hosts for this loop 8218 1726776699.49743: done getting the remaining hosts for this loop 8218 1726776699.49746: getting the next task for host managed_node2 8218 1726776699.49753: done getting next task for host managed_node2 8218 1726776699.49756: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8218 1726776699.49759: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776699.49772: getting variables 8218 1726776699.49773: in VariableManager get_vars() 8218 1726776699.49799: Calling all_inventory to load vars for managed_node2 8218 1726776699.49801: Calling groups_inventory to load vars for managed_node2 8218 1726776699.49803: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776699.49812: Calling all_plugins_play to load vars for managed_node2 8218 1726776699.49814: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776699.49816: Calling groups_plugins_play to load vars for managed_node2 8218 1726776699.49957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776699.50079: done with get_vars() 8218 1726776699.50086: done getting variables 8218 1726776699.50123: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:11:39 -0400 (0:00:00.027) 0:01:25.332 **** 8218 1726776699.50148: entering _queue_task() for managed_node2/package 8218 1726776699.50293: worker is 1 (out of 1 available) 8218 1726776699.50306: exiting _queue_task() for managed_node2/package 8218 1726776699.50318: done queuing things up, now waiting for results queue to drain 8218 1726776699.50319: waiting for pending results... 
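The task queued below (tasks/main.yml:12) uses the generic `package` action with the `__kernel_settings_packages` list loaded by the previous `include_vars` task; the log also shows `__kernel_settings_is_ostree` being consulted while templating its arguments. A minimal sketch under those assumptions:

```yaml
# Hedged sketch of roles/kernel_settings/tasks/main.yml:12. The module and
# package list match the log; how __kernel_settings_is_ostree influences the
# task (e.g. selecting a package backend) is not visible here and is omitted.
- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"  # ["tuned", "python3-configobj"] per the included vars
    state: present
```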
11370 1726776699.50441: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 11370 1726776699.50554: in run() - task 120fa90a-8a95-cec2-986e-000000000ed3 11370 1726776699.50571: variable 'ansible_search_path' from source: unknown 11370 1726776699.50575: variable 'ansible_search_path' from source: unknown 11370 1726776699.50600: calling self._execute() 11370 1726776699.50668: variable 'ansible_host' from source: host vars for 'managed_node2' 11370 1726776699.50676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11370 1726776699.50684: variable 'omit' from source: magic vars 11370 1726776699.50754: variable 'omit' from source: magic vars 11370 1726776699.50793: variable 'omit' from source: magic vars 11370 1726776699.50811: variable '__kernel_settings_packages' from source: include_vars 11370 1726776699.51011: variable '__kernel_settings_packages' from source: include_vars 11370 1726776699.51163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11370 1726776699.52633: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11370 1726776699.52686: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11370 1726776699.52715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11370 1726776699.52742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11370 1726776699.52762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11370 1726776699.52826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11370 1726776699.52848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11370 1726776699.52867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11370 1726776699.52894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11370 1726776699.52905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11370 1726776699.52976: variable '__kernel_settings_is_ostree' from source: set_fact 11370 1726776699.52982: variable 'omit' from source: magic vars 11370 1726776699.53003: variable 'omit' from source: magic vars 11370 1726776699.53023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11370 1726776699.53044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11370 1726776699.53060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11370 1726776699.53074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11370 1726776699.53083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11370 1726776699.53105: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11370 1726776699.53110: variable 'ansible_host' from source: host vars for 
'managed_node2' 11370 1726776699.53115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11370 1726776699.53180: Set connection var ansible_connection to ssh 11370 1726776699.53188: Set connection var ansible_pipelining to False 11370 1726776699.53194: Set connection var ansible_timeout to 10 11370 1726776699.53201: Set connection var ansible_module_compression to ZIP_DEFLATED 11370 1726776699.53206: Set connection var ansible_shell_type to sh 11370 1726776699.53211: Set connection var ansible_shell_executable to /bin/sh 11370 1726776699.53227: variable 'ansible_shell_executable' from source: unknown 11370 1726776699.53232: variable 'ansible_connection' from source: unknown 11370 1726776699.53236: variable 'ansible_module_compression' from source: unknown 11370 1726776699.53239: variable 'ansible_shell_type' from source: unknown 11370 1726776699.53243: variable 'ansible_shell_executable' from source: unknown 11370 1726776699.53246: variable 'ansible_host' from source: host vars for 'managed_node2' 11370 1726776699.53250: variable 'ansible_pipelining' from source: unknown 11370 1726776699.53254: variable 'ansible_timeout' from source: unknown 11370 1726776699.53258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11370 1726776699.53317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11370 1726776699.53330: variable 'omit' from source: magic vars 11370 1726776699.53336: starting attempt loop 11370 1726776699.53340: running the handler 11370 1726776699.53396: variable 'ansible_facts' from source: unknown 11370 1726776699.53476: _low_level_execute_command(): starting 11370 1726776699.53485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 11370 1726776699.55793: stdout chunk (state=2): >>>/root <<< 11370 1726776699.55914: stderr chunk (state=3): >>><<< 11370 1726776699.55921: stdout chunk (state=3): >>><<< 11370 1726776699.55940: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11370 1726776699.55951: _low_level_execute_command(): starting 11370 1726776699.55957: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225 `" && echo ansible-tmp-1726776699.559473-11370-157770267761225="` echo /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225 `" ) && sleep 0' 11370 1726776699.58536: stdout chunk (state=2): >>>ansible-tmp-1726776699.559473-11370-157770267761225=/root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225 <<< 11370 1726776699.58664: stderr chunk (state=3): >>><<< 11370 1726776699.58671: stdout chunk (state=3): >>><<< 11370 1726776699.58684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776699.559473-11370-157770267761225=/root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225 , stderr= 11370 1726776699.58706: variable 'ansible_module_compression' from source: unknown 11370 1726776699.58752: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11370 1726776699.58790: variable 'ansible_facts' from source: unknown 11370 1726776699.58879: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/AnsiballZ_dnf.py 11370 1726776699.58974: Sending initial data 11370 1726776699.58980: Sent initial data (150 bytes) 11370 1726776699.61450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpwlv9nv6b /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/AnsiballZ_dnf.py <<< 11370 1726776699.62789: stderr chunk (state=3): 
>>><<< 11370 1726776699.62798: stdout chunk (state=3): >>><<< 11370 1726776699.62815: done transferring module to remote 11370 1726776699.62823: _low_level_execute_command(): starting 11370 1726776699.62826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/ /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/AnsiballZ_dnf.py && sleep 0' 11370 1726776699.65127: stderr chunk (state=2): >>><<< 11370 1726776699.65137: stdout chunk (state=2): >>><<< 11370 1726776699.65149: _low_level_execute_command() done: rc=0, stdout=, stderr= 11370 1726776699.65153: _low_level_execute_command(): starting 11370 1726776699.65159: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/AnsiballZ_dnf.py && sleep 0' 11370 1726776702.20637: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 11370 1726776702.28922: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11370 1726776702.28972: stderr chunk (state=3): >>><<< 11370 1726776702.28980: stdout chunk (state=3): >>><<< 11370 1726776702.28996: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11370 1726776702.29035: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11370 1726776702.29044: _low_level_execute_command(): starting 11370 1726776702.29050: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776699.559473-11370-157770267761225/ > /dev/null 2>&1 && sleep 0' 11370 1726776702.31463: stderr chunk (state=2): >>><<< 11370 1726776702.31473: stdout chunk (state=2): >>><<< 11370 1726776702.31488: _low_level_execute_command() done: rc=0, stdout=, stderr= 11370 1726776702.31495: handler run complete 11370 1726776702.31522: attempt loop complete, returning result 11370 1726776702.31526: _execute() done 11370 1726776702.31530: dumping result to json 11370 1726776702.31536: done dumping result, returning 11370 1726776702.31542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-cec2-986e-000000000ed3] 11370 1726776702.31548: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed3 11370 1726776702.31579: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed3 11370 1726776702.31583: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8218 1726776702.31750: no more pending results, 
returning what we have 8218 1726776702.31753: results queue empty 8218 1726776702.31754: checking for any_errors_fatal 8218 1726776702.31762: done checking for any_errors_fatal 8218 1726776702.31762: checking for max_fail_percentage 8218 1726776702.31764: done checking for max_fail_percentage 8218 1726776702.31764: checking to see if all hosts have failed and the running result is not ok 8218 1726776702.31765: done checking to see if all hosts have failed 8218 1726776702.31765: getting the remaining hosts for this loop 8218 1726776702.31766: done getting the remaining hosts for this loop 8218 1726776702.31769: getting the next task for host managed_node2 8218 1726776702.31778: done getting next task for host managed_node2 8218 1726776702.31781: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8218 1726776702.31784: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776702.31793: getting variables 8218 1726776702.31794: in VariableManager get_vars() 8218 1726776702.31830: Calling all_inventory to load vars for managed_node2 8218 1726776702.31833: Calling groups_inventory to load vars for managed_node2 8218 1726776702.31834: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776702.31841: Calling all_plugins_play to load vars for managed_node2 8218 1726776702.31843: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776702.31844: Calling groups_plugins_play to load vars for managed_node2 8218 1726776702.31951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776702.32074: done with get_vars() 8218 1726776702.32083: done getting variables 8218 1726776702.32125: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:11:42 -0400 (0:00:02.820) 0:01:28.152 **** 8218 1726776702.32156: entering _queue_task() for managed_node2/debug 8218 1726776702.32315: worker is 1 (out of 1 available) 8218 1726776702.32331: exiting _queue_task() for managed_node2/debug 8218 1726776702.32342: done queuing things up, now waiting for results queue to drain 8218 1726776702.32344: waiting for pending results... 
11387 1726776702.32478: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 11387 1726776702.32602: in run() - task 120fa90a-8a95-cec2-986e-000000000ed5 11387 1726776702.32619: variable 'ansible_search_path' from source: unknown 11387 1726776702.32624: variable 'ansible_search_path' from source: unknown 11387 1726776702.32655: calling self._execute() 11387 1726776702.32722: variable 'ansible_host' from source: host vars for 'managed_node2' 11387 1726776702.32732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11387 1726776702.32742: variable 'omit' from source: magic vars 11387 1726776702.33080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11387 1726776702.34642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11387 1726776702.34690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11387 1726776702.34727: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11387 1726776702.34757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11387 1726776702.34779: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11387 1726776702.34834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11387 1726776702.34855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11387 1726776702.34876: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11387 1726776702.34903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11387 1726776702.34914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11387 1726776702.34993: variable '__kernel_settings_is_transactional' from source: set_fact 11387 1726776702.35009: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11387 1726776702.35013: when evaluation is False, skipping this task 11387 1726776702.35016: _execute() done 11387 1726776702.35020: dumping result to json 11387 1726776702.35025: done dumping result, returning 11387 1726776702.35032: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000ed5] 11387 1726776702.35039: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed5 11387 1726776702.35063: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed5 11387 1726776702.35069: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8218 1726776702.35189: no more pending results, returning what we have 8218 1726776702.35192: results queue empty 8218 1726776702.35193: checking for any_errors_fatal 8218 1726776702.35201: done checking for any_errors_fatal 8218 1726776702.35201: checking for max_fail_percentage 8218 1726776702.35203: done checking for max_fail_percentage 8218 1726776702.35204: checking to see if all hosts have failed and the running 
result is not ok 8218 1726776702.35205: done checking to see if all hosts have failed 8218 1726776702.35205: getting the remaining hosts for this loop 8218 1726776702.35206: done getting the remaining hosts for this loop 8218 1726776702.35209: getting the next task for host managed_node2 8218 1726776702.35215: done getting next task for host managed_node2 8218 1726776702.35218: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8218 1726776702.35221: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776702.35239: getting variables 8218 1726776702.35240: in VariableManager get_vars() 8218 1726776702.35271: Calling all_inventory to load vars for managed_node2 8218 1726776702.35274: Calling groups_inventory to load vars for managed_node2 8218 1726776702.35276: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776702.35284: Calling all_plugins_play to load vars for managed_node2 8218 1726776702.35286: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776702.35288: Calling groups_plugins_play to load vars for managed_node2 8218 1726776702.35395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776702.35561: done with get_vars() 8218 1726776702.35569: done getting variables 8218 1726776702.35609: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:11:42 -0400 (0:00:00.034) 0:01:28.186 **** 8218 1726776702.35634: entering _queue_task() for managed_node2/reboot 8218 1726776702.35784: worker is 1 (out of 1 available) 8218 1726776702.35798: exiting _queue_task() for managed_node2/reboot 8218 1726776702.35810: done queuing things up, now waiting for results queue to drain 8218 1726776702.35812: waiting for pending results... 
11388 1726776702.35941: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 11388 1726776702.36060: in run() - task 120fa90a-8a95-cec2-986e-000000000ed6 11388 1726776702.36078: variable 'ansible_search_path' from source: unknown 11388 1726776702.36083: variable 'ansible_search_path' from source: unknown 11388 1726776702.36109: calling self._execute() 11388 1726776702.36181: variable 'ansible_host' from source: host vars for 'managed_node2' 11388 1726776702.36190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11388 1726776702.36198: variable 'omit' from source: magic vars 11388 1726776702.36532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11388 1726776702.38052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11388 1726776702.38100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11388 1726776702.38127: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11388 1726776702.38157: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11388 1726776702.38179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11388 1726776702.38235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11388 1726776702.38256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11388 1726776702.38277: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11388 1726776702.38304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11388 1726776702.38316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11388 1726776702.38393: variable '__kernel_settings_is_transactional' from source: set_fact 11388 1726776702.38410: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11388 1726776702.38415: when evaluation is False, skipping this task 11388 1726776702.38419: _execute() done 11388 1726776702.38422: dumping result to json 11388 1726776702.38426: done dumping result, returning 11388 1726776702.38434: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-cec2-986e-000000000ed6] 11388 1726776702.38440: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed6 11388 1726776702.38462: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed6 11388 1726776702.38469: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776702.38567: no more pending results, returning what we have 8218 1726776702.38570: results queue empty 8218 1726776702.38571: checking for any_errors_fatal 8218 1726776702.38578: done checking for any_errors_fatal 8218 1726776702.38579: checking for max_fail_percentage 8218 1726776702.38580: done checking for max_fail_percentage 8218 1726776702.38581: checking 
to see if all hosts have failed and the running result is not ok 8218 1726776702.38582: done checking to see if all hosts have failed 8218 1726776702.38582: getting the remaining hosts for this loop 8218 1726776702.38583: done getting the remaining hosts for this loop 8218 1726776702.38586: getting the next task for host managed_node2 8218 1726776702.38593: done getting next task for host managed_node2 8218 1726776702.38596: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8218 1726776702.38599: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776702.38614: getting variables 8218 1726776702.38615: in VariableManager get_vars() 8218 1726776702.38649: Calling all_inventory to load vars for managed_node2 8218 1726776702.38652: Calling groups_inventory to load vars for managed_node2 8218 1726776702.38654: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776702.38662: Calling all_plugins_play to load vars for managed_node2 8218 1726776702.38664: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776702.38667: Calling groups_plugins_play to load vars for managed_node2 8218 1726776702.38787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776702.38905: done with get_vars() 8218 1726776702.38913: done getting variables 8218 1726776702.38956: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:11:42 -0400 (0:00:00.033) 0:01:28.220 **** 8218 1726776702.38980: entering _queue_task() for managed_node2/fail 8218 1726776702.39135: worker is 1 (out of 1 available) 8218 1726776702.39150: exiting _queue_task() for managed_node2/fail 8218 1726776702.39161: done queuing things up, now waiting for results queue to drain 8218 1726776702.39162: waiting for pending results... 
11389 1726776702.39292: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 11389 1726776702.39407: in run() - task 120fa90a-8a95-cec2-986e-000000000ed7 11389 1726776702.39425: variable 'ansible_search_path' from source: unknown 11389 1726776702.39431: variable 'ansible_search_path' from source: unknown 11389 1726776702.39455: calling self._execute() 11389 1726776702.39522: variable 'ansible_host' from source: host vars for 'managed_node2' 11389 1726776702.39530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11389 1726776702.39537: variable 'omit' from source: magic vars 11389 1726776702.39876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726776702.41435: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726776702.41482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726776702.41519: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726776702.41553: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726776702.41574: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726776702.41627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726776702.41650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726776702.41670: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726776702.41697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726776702.41708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726776702.41787: variable '__kernel_settings_is_transactional' from source: set_fact 11389 1726776702.41803: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11389 1726776702.41808: when evaluation is False, skipping this task 11389 1726776702.41811: _execute() done 11389 1726776702.41815: dumping result to json 11389 1726776702.41819: done dumping result, returning 11389 1726776702.41826: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-cec2-986e-000000000ed7] 11389 1726776702.41834: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed7 11389 1726776702.41856: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed7 11389 1726776702.41860: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8218 1726776702.41968: no more pending results, returning what we have 8218 1726776702.41970: results queue empty 8218 1726776702.41971: checking for any_errors_fatal 8218 1726776702.41977: done checking for any_errors_fatal 8218 1726776702.41978: checking for max_fail_percentage 8218 1726776702.41979: done checking for max_fail_percentage 8218 1726776702.41980: 
checking to see if all hosts have failed and the running result is not ok 8218 1726776702.41981: done checking to see if all hosts have failed 8218 1726776702.41981: getting the remaining hosts for this loop 8218 1726776702.41982: done getting the remaining hosts for this loop 8218 1726776702.41985: getting the next task for host managed_node2 8218 1726776702.41993: done getting next task for host managed_node2 8218 1726776702.41997: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8218 1726776702.41999: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776702.42016: getting variables 8218 1726776702.42017: in VariableManager get_vars() 8218 1726776702.42051: Calling all_inventory to load vars for managed_node2 8218 1726776702.42054: Calling groups_inventory to load vars for managed_node2 8218 1726776702.42056: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776702.42065: Calling all_plugins_play to load vars for managed_node2 8218 1726776702.42070: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776702.42072: Calling groups_plugins_play to load vars for managed_node2 8218 1726776702.42238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776702.42354: done with get_vars() 8218 1726776702.42362: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:11:42 -0400 (0:00:00.034) 0:01:28.254 **** 8218 1726776702.42421: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776702.42583: worker is 1 (out of 1 available) 8218 1726776702.42598: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776702.42610: done queuing things up, now waiting for results queue to drain 8218 1726776702.42613: waiting for pending results... 
11390 1726776702.42737: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 11390 1726776702.42843: in run() - task 120fa90a-8a95-cec2-986e-000000000ed9 11390 1726776702.42858: variable 'ansible_search_path' from source: unknown 11390 1726776702.42861: variable 'ansible_search_path' from source: unknown 11390 1726776702.42885: calling self._execute() 11390 1726776702.42953: variable 'ansible_host' from source: host vars for 'managed_node2' 11390 1726776702.42960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11390 1726776702.42966: variable 'omit' from source: magic vars 11390 1726776702.43044: variable 'omit' from source: magic vars 11390 1726776702.43091: variable 'omit' from source: magic vars 11390 1726776702.43113: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11390 1726776702.43323: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11390 1726776702.43383: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11390 1726776702.43411: variable 'omit' from source: magic vars 11390 1726776702.43445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11390 1726776702.43469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11390 1726776702.43488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11390 1726776702.43501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11390 1726776702.43513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11390 1726776702.43537: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11390 1726776702.43542: variable 'ansible_host' from 
source: host vars for 'managed_node2' 11390 1726776702.43547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11390 1726776702.43612: Set connection var ansible_connection to ssh 11390 1726776702.43621: Set connection var ansible_pipelining to False 11390 1726776702.43627: Set connection var ansible_timeout to 10 11390 1726776702.43636: Set connection var ansible_module_compression to ZIP_DEFLATED 11390 1726776702.43641: Set connection var ansible_shell_type to sh 11390 1726776702.43646: Set connection var ansible_shell_executable to /bin/sh 11390 1726776702.43661: variable 'ansible_shell_executable' from source: unknown 11390 1726776702.43665: variable 'ansible_connection' from source: unknown 11390 1726776702.43669: variable 'ansible_module_compression' from source: unknown 11390 1726776702.43672: variable 'ansible_shell_type' from source: unknown 11390 1726776702.43675: variable 'ansible_shell_executable' from source: unknown 11390 1726776702.43678: variable 'ansible_host' from source: host vars for 'managed_node2' 11390 1726776702.43683: variable 'ansible_pipelining' from source: unknown 11390 1726776702.43686: variable 'ansible_timeout' from source: unknown 11390 1726776702.43690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11390 1726776702.43811: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11390 1726776702.43823: variable 'omit' from source: magic vars 11390 1726776702.43830: starting attempt loop 11390 1726776702.43834: running the handler 11390 1726776702.43846: _low_level_execute_command(): starting 11390 1726776702.43852: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11390 1726776702.46198: stdout chunk (state=2): >>>/root <<< 11390 1726776702.46323: 
stderr chunk (state=3): >>><<< 11390 1726776702.46332: stdout chunk (state=3): >>><<< 11390 1726776702.46351: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11390 1726776702.46365: _low_level_execute_command(): starting 11390 1726776702.46372: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568 `" && echo ansible-tmp-1726776702.463599-11390-247661893110568="` echo /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568 `" ) && sleep 0' 11390 1726776702.48978: stdout chunk (state=2): >>>ansible-tmp-1726776702.463599-11390-247661893110568=/root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568 <<< 11390 1726776702.49104: stderr chunk (state=3): >>><<< 11390 1726776702.49111: stdout chunk (state=3): >>><<< 11390 1726776702.49126: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776702.463599-11390-247661893110568=/root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568 , stderr= 11390 1726776702.49165: variable 'ansible_module_compression' from source: unknown 11390 1726776702.49201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11390 1726776702.49235: variable 'ansible_facts' from source: unknown 11390 1726776702.49300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/AnsiballZ_kernel_settings_get_config.py 11390 1726776702.49396: Sending initial data 11390 1726776702.49403: Sent initial data (173 bytes) 11390 1726776702.51867: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpko6gw6w5 /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/AnsiballZ_kernel_settings_get_config.py <<< 11390 
1726776702.52880: stderr chunk (state=3): >>><<< 11390 1726776702.52887: stdout chunk (state=3): >>><<< 11390 1726776702.52905: done transferring module to remote 11390 1726776702.52915: _low_level_execute_command(): starting 11390 1726776702.52919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/ /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11390 1726776702.55257: stderr chunk (state=2): >>><<< 11390 1726776702.55264: stdout chunk (state=2): >>><<< 11390 1726776702.55281: _low_level_execute_command() done: rc=0, stdout=, stderr= 11390 1726776702.55285: _low_level_execute_command(): starting 11390 1726776702.55290: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11390 1726776702.71002: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 11390 1726776702.72074: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11390 1726776702.72120: stderr chunk (state=3): >>><<< 11390 1726776702.72127: stdout chunk (state=3): >>><<< 11390 1726776702.72145: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 11390 1726776702.72177: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11390 1726776702.72189: _low_level_execute_command(): starting 11390 1726776702.72195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776702.463599-11390-247661893110568/ > /dev/null 2>&1 && sleep 0' 11390 1726776702.74652: stderr chunk (state=2): >>><<< 11390 1726776702.74662: stdout chunk (state=2): >>><<< 11390 1726776702.74679: _low_level_execute_command() done: rc=0, stdout=, stderr= 11390 1726776702.74686: handler run complete 11390 1726776702.74701: attempt loop complete, returning result 11390 1726776702.74705: _execute() done 11390 1726776702.74708: dumping 
result to json 11390 1726776702.74713: done dumping result, returning 11390 1726776702.74720: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-cec2-986e-000000000ed9] 11390 1726776702.74727: sending task result for task 120fa90a-8a95-cec2-986e-000000000ed9 11390 1726776702.74761: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ed9 11390 1726776702.74768: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "data": {
        "daemon": "1",
        "default_instance_priority": "0",
        "dynamic_tuning": "0",
        "log_file_count": "2",
        "log_file_max_size": "1MB",
        "reapply_sysctl": "1",
        "recommend_command": "1",
        "sleep_interval": "1",
        "udev_buffer_size": "1MB",
        "update_interval": "10"
    }
}
8218 1726776702.74917: no more pending results, returning what we have 8218 1726776702.74920: results queue empty 8218 1726776702.74921: checking for any_errors_fatal 8218 1726776702.74927: done checking for any_errors_fatal 8218 1726776702.74927: checking for max_fail_percentage 8218 1726776702.74931: done checking for max_fail_percentage 8218 1726776702.74932: checking to see if all hosts have failed and the running result is not ok 8218 1726776702.74933: done checking to see if all hosts have failed 8218 1726776702.74933: getting the remaining hosts for this loop 8218 1726776702.74934: done getting the remaining hosts for this loop 8218 1726776702.74937: getting the next task for host managed_node2 8218 1726776702.74944: done getting next task for host managed_node2 8218 1726776702.74947: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8218 1726776702.74950: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776702.74961: getting variables 8218 1726776702.74962: in VariableManager get_vars() 8218 1726776702.74995: Calling all_inventory to load vars for managed_node2 8218 1726776702.74998: Calling groups_inventory to load vars for managed_node2 8218 1726776702.75000: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776702.75008: Calling all_plugins_play to load vars for managed_node2 8218 1726776702.75010: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776702.75012: Calling groups_plugins_play to load vars for managed_node2 8218 1726776702.75130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776702.75250: done with get_vars() 8218 1726776702.75259: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:11:42 -0400 (0:00:00.329) 0:01:28.583 **** 8218 1726776702.75327: entering _queue_task() for managed_node2/stat 8218 1726776702.75494: worker is 1 (out of 1 available) 8218 1726776702.75509: exiting _queue_task() for managed_node2/stat 8218 1726776702.75520: done queuing things up, now waiting for results queue to drain 8218 
1726776702.75522: waiting for pending results... 11398 1726776702.75660: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 11398 1726776702.75784: in run() - task 120fa90a-8a95-cec2-986e-000000000eda 11398 1726776702.75803: variable 'ansible_search_path' from source: unknown 11398 1726776702.75807: variable 'ansible_search_path' from source: unknown 11398 1726776702.75846: variable '__prof_from_conf' from source: task vars 11398 1726776702.76095: variable '__prof_from_conf' from source: task vars 11398 1726776702.76238: variable '__data' from source: task vars 11398 1726776702.76295: variable '__kernel_settings_register_tuned_main' from source: set_fact 11398 1726776702.76441: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11398 1726776702.76453: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11398 1726776702.76499: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11398 1726776702.76589: variable 'omit' from source: magic vars 11398 1726776702.76667: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776702.76678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776702.76687: variable 'omit' from source: magic vars 11398 1726776702.76859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11398 1726776702.78370: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11398 1726776702.78425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11398 1726776702.78456: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11398 1726776702.78483: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
11398 1726776702.78504: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11398 1726776702.78560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11398 1726776702.78583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11398 1726776702.78601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11398 1726776702.78630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11398 1726776702.78642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11398 1726776702.78715: variable 'item' from source: unknown 11398 1726776702.78730: Evaluated conditional (item | length > 0): False 11398 1726776702.78734: when evaluation is False, skipping this task 11398 1726776702.78758: variable 'item' from source: unknown 11398 1726776702.78805: variable 'item' from source: unknown
skipping: [managed_node2] => (item=) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "item | length > 0",
    "item": "",
    "skip_reason": "Conditional result was False"
}
11398 1726776702.78883: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776702.78893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776702.78902: variable
'omit' from source: magic vars 11398 1726776702.79021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11398 1726776702.79042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11398 1726776702.79059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11398 1726776702.79087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11398 1726776702.79099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11398 1726776702.79155: variable 'item' from source: unknown 11398 1726776702.79164: Evaluated conditional (item | length > 0): True 11398 1726776702.79171: variable 'omit' from source: magic vars 11398 1726776702.79206: variable 'omit' from source: magic vars 11398 1726776702.79241: variable 'item' from source: unknown 11398 1726776702.79285: variable 'item' from source: unknown 11398 1726776702.79299: variable 'omit' from source: magic vars 11398 1726776702.79318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11398 1726776702.79340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11398 1726776702.79355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11398 1726776702.79369: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11398 1726776702.79379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11398 1726776702.79400: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11398 1726776702.79404: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776702.79408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776702.79473: Set connection var ansible_connection to ssh 11398 1726776702.79480: Set connection var ansible_pipelining to False 11398 1726776702.79487: Set connection var ansible_timeout to 10 11398 1726776702.79495: Set connection var ansible_module_compression to ZIP_DEFLATED 11398 1726776702.79500: Set connection var ansible_shell_type to sh 11398 1726776702.79505: Set connection var ansible_shell_executable to /bin/sh 11398 1726776702.79519: variable 'ansible_shell_executable' from source: unknown 11398 1726776702.79523: variable 'ansible_connection' from source: unknown 11398 1726776702.79527: variable 'ansible_module_compression' from source: unknown 11398 1726776702.79532: variable 'ansible_shell_type' from source: unknown 11398 1726776702.79535: variable 'ansible_shell_executable' from source: unknown 11398 1726776702.79539: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776702.79543: variable 'ansible_pipelining' from source: unknown 11398 1726776702.79546: variable 'ansible_timeout' from source: unknown 11398 1726776702.79550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776702.79638: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11398 1726776702.79649: variable 'omit' from source: magic vars 11398 1726776702.79655: starting attempt loop 11398 1726776702.79659: running the handler 11398 1726776702.79670: _low_level_execute_command(): starting 11398 1726776702.79677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11398 1726776702.82004: stdout chunk (state=2): >>>/root <<< 11398 1726776702.82131: stderr chunk (state=3): >>><<< 11398 1726776702.82138: stdout chunk (state=3): >>><<< 11398 1726776702.82156: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11398 1726776702.82169: _low_level_execute_command(): starting 11398 1726776702.82175: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846 `" && echo ansible-tmp-1726776702.8216343-11398-92697035087846="` echo /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846 `" ) && sleep 0' 11398 1726776702.85024: stdout chunk (state=2): >>>ansible-tmp-1726776702.8216343-11398-92697035087846=/root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846 <<< 11398 1726776702.85155: stderr chunk (state=3): >>><<< 11398 1726776702.85162: stdout chunk (state=3): >>><<< 11398 1726776702.85179: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776702.8216343-11398-92697035087846=/root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846 , stderr= 11398 1726776702.85214: variable 'ansible_module_compression' from source: unknown 11398 1726776702.85255: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11398 1726776702.85287: variable 'ansible_facts' from source: unknown 11398 1726776702.85353: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/AnsiballZ_stat.py 11398 1726776702.85452: Sending initial data 11398 1726776702.85459: Sent initial data (151 bytes) 11398 1726776702.88032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpkqlt1hae /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/AnsiballZ_stat.py <<< 11398 1726776702.89094: stderr chunk (state=3): >>><<< 11398 1726776702.89101: stdout chunk (state=3): >>><<< 11398 1726776702.89120: done transferring module to remote 11398 1726776702.89132: _low_level_execute_command(): starting 11398 1726776702.89138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/ /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/AnsiballZ_stat.py && sleep 0' 11398 1726776702.91575: stderr chunk (state=2): >>><<< 11398 1726776702.91583: stdout chunk (state=2): >>><<< 11398 1726776702.91597: _low_level_execute_command() done: rc=0, stdout=, stderr= 11398 1726776702.91601: _low_level_execute_command(): starting 11398 1726776702.91607: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/AnsiballZ_stat.py && sleep 0' 11398 1726776703.06938: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11398 1726776703.08009: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11398 1726776703.08059: stderr chunk (state=3): >>><<< 11398 1726776703.08068: stdout chunk (state=3): >>><<< 11398 1726776703.08084: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 11398 1726776703.08106: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11398 1726776703.08116: _low_level_execute_command(): starting 11398 1726776703.08122: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776702.8216343-11398-92697035087846/ > /dev/null 2>&1 && sleep 0' 11398 1726776703.10567: stderr chunk (state=2): >>><<< 11398 1726776703.10574: stdout chunk (state=2): >>><<< 11398 1726776703.10587: _low_level_execute_command() done: rc=0, stdout=, stderr= 11398 1726776703.10596: handler run complete 11398 1726776703.10611: attempt loop complete, returning result 11398 1726776703.10627: variable 'item' from source: unknown 11398 1726776703.10691: variable 'item' from source: unknown
ok: [managed_node2] => (item=/etc/tuned/profiles) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned/profiles",
    "stat": {
        "exists": false
    }
}
11398
1726776703.10781: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776703.10791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776703.10800: variable 'omit' from source: magic vars 11398 1726776703.10911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11398 1726776703.10937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11398 1726776703.10956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11398 1726776703.10985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11398 1726776703.10997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11398 1726776703.11057: variable 'item' from source: unknown 11398 1726776703.11068: Evaluated conditional (item | length > 0): True 11398 1726776703.11073: variable 'omit' from source: magic vars 11398 1726776703.11085: variable 'omit' from source: magic vars 11398 1726776703.11115: variable 'item' from source: unknown 11398 1726776703.11160: variable 'item' from source: unknown 11398 1726776703.11175: variable 'omit' from source: magic vars 11398 1726776703.11192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11398 
1726776703.11200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11398 1726776703.11206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11398 1726776703.11218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11398 1726776703.11222: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776703.11226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776703.11277: Set connection var ansible_connection to ssh 11398 1726776703.11284: Set connection var ansible_pipelining to False 11398 1726776703.11290: Set connection var ansible_timeout to 10 11398 1726776703.11297: Set connection var ansible_module_compression to ZIP_DEFLATED 11398 1726776703.11302: Set connection var ansible_shell_type to sh 11398 1726776703.11307: Set connection var ansible_shell_executable to /bin/sh 11398 1726776703.11321: variable 'ansible_shell_executable' from source: unknown 11398 1726776703.11324: variable 'ansible_connection' from source: unknown 11398 1726776703.11327: variable 'ansible_module_compression' from source: unknown 11398 1726776703.11332: variable 'ansible_shell_type' from source: unknown 11398 1726776703.11335: variable 'ansible_shell_executable' from source: unknown 11398 1726776703.11339: variable 'ansible_host' from source: host vars for 'managed_node2' 11398 1726776703.11343: variable 'ansible_pipelining' from source: unknown 11398 1726776703.11346: variable 'ansible_timeout' from source: unknown 11398 1726776703.11350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11398 1726776703.11415: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11398 1726776703.11426: variable 'omit' from source: magic vars 11398 1726776703.11434: starting attempt loop 11398 1726776703.11437: running the handler 11398 1726776703.11444: _low_level_execute_command(): starting 11398 1726776703.11448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11398 1726776703.13669: stdout chunk (state=2): >>>/root <<< 11398 1726776703.13789: stderr chunk (state=3): >>><<< 11398 1726776703.13795: stdout chunk (state=3): >>><<< 11398 1726776703.13809: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11398 1726776703.13819: _low_level_execute_command(): starting 11398 1726776703.13825: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446 `" && echo ansible-tmp-1726776703.1381586-11398-83311879027446="` echo /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446 `" ) && sleep 0' 11398 1726776703.16371: stdout chunk (state=2): >>>ansible-tmp-1726776703.1381586-11398-83311879027446=/root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446 <<< 11398 1726776703.16497: stderr chunk (state=3): >>><<< 11398 1726776703.16503: stdout chunk (state=3): >>><<< 11398 1726776703.16515: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776703.1381586-11398-83311879027446=/root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446 , stderr= 11398 1726776703.16544: variable 'ansible_module_compression' from source: unknown 11398 1726776703.16579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11398 1726776703.16598: variable 'ansible_facts' from source: unknown 11398 
1726776703.16653: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/AnsiballZ_stat.py 11398 1726776703.16736: Sending initial data 11398 1726776703.16743: Sent initial data (151 bytes) 11398 1726776703.19233: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp9gjnws1r /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/AnsiballZ_stat.py <<< 11398 1726776703.20272: stderr chunk (state=3): >>><<< 11398 1726776703.20279: stdout chunk (state=3): >>><<< 11398 1726776703.20297: done transferring module to remote 11398 1726776703.20306: _low_level_execute_command(): starting 11398 1726776703.20310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/ /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/AnsiballZ_stat.py && sleep 0' 11398 1726776703.22613: stderr chunk (state=2): >>><<< 11398 1726776703.22622: stdout chunk (state=2): >>><<< 11398 1726776703.22638: _low_level_execute_command() done: rc=0, stdout=, stderr= 11398 1726776703.22642: _low_level_execute_command(): starting 11398 1726776703.22647: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/AnsiballZ_stat.py && sleep 0' 11398 1726776703.38588: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 
4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11398 1726776703.39790: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11398 1726776703.39842: stderr chunk (state=3): >>><<< 11398 1726776703.39848: stdout chunk (state=3): >>><<< 11398 1726776703.39864: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726776634.1489303, "mtime": 1726776632.1399238, "ctime": 1726776632.1399238, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.12.75 closed. 
11398 1726776703.40145: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11398 1726776703.40155: _low_level_execute_command(): starting 11398 1726776703.40161: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776703.1381586-11398-83311879027446/ > /dev/null 2>&1 && sleep 0' 11398 1726776703.42597: stderr chunk (state=2): >>><<< 11398 1726776703.42605: stdout chunk (state=2): >>><<< 11398 1726776703.42618: _low_level_execute_command() done: rc=0, stdout=, stderr= 11398 1726776703.42624: handler run complete 11398 1726776703.42654: attempt loop complete, returning result 11398 1726776703.42671: variable 'item' from source: unknown 11398 1726776703.42732: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726776634.1489303, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726776632.1399238, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726776632.1399238, "nlink": 4, "path": "/etc/tuned", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11398 1726776703.42776: dumping result to json 11398 1726776703.42786: done dumping result, returning 11398 1726776703.42794: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-cec2-986e-000000000eda] 11398 1726776703.42799: sending task result for task 120fa90a-8a95-cec2-986e-000000000eda 11398 1726776703.42840: done sending task result for task 120fa90a-8a95-cec2-986e-000000000eda 11398 1726776703.42844: WORKER PROCESS EXITING 8218 1726776703.43095: no more pending results, returning what we have 8218 1726776703.43098: results queue empty 8218 1726776703.43099: checking for any_errors_fatal 8218 1726776703.43103: done checking for any_errors_fatal 8218 1726776703.43104: checking for max_fail_percentage 8218 1726776703.43105: done checking for max_fail_percentage 8218 1726776703.43106: checking to see if all hosts have failed and the running result is not ok 8218 1726776703.43106: done checking to see if all hosts have failed 8218 1726776703.43107: getting the remaining hosts for this loop 8218 1726776703.43108: done getting the remaining hosts for this loop 8218 1726776703.43110: getting the next task for host managed_node2 8218 1726776703.43116: done getting next task for host managed_node2 8218 1726776703.43119: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8218 1726776703.43122: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776703.43133: getting variables 8218 1726776703.43134: in VariableManager get_vars() 8218 1726776703.43156: Calling all_inventory to load vars for managed_node2 8218 1726776703.43158: Calling groups_inventory to load vars for managed_node2 8218 1726776703.43159: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776703.43166: Calling all_plugins_play to load vars for managed_node2 8218 1726776703.43168: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776703.43170: Calling groups_plugins_play to load vars for managed_node2 8218 1726776703.43264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776703.43379: done with get_vars() 8218 1726776703.43387: done getting variables 8218 1726776703.43430: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 16:11:43 -0400 
(0:00:00.681) 0:01:29.265 **** 8218 1726776703.43453: entering _queue_task() for managed_node2/set_fact 8218 1726776703.43611: worker is 1 (out of 1 available) 8218 1726776703.43626: exiting _queue_task() for managed_node2/set_fact 8218 1726776703.43639: done queuing things up, now waiting for results queue to drain 8218 1726776703.43640: waiting for pending results... 11416 1726776703.43771: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11416 1726776703.43888: in run() - task 120fa90a-8a95-cec2-986e-000000000edb 11416 1726776703.43905: variable 'ansible_search_path' from source: unknown 11416 1726776703.43908: variable 'ansible_search_path' from source: unknown 11416 1726776703.43936: calling self._execute() 11416 1726776703.44004: variable 'ansible_host' from source: host vars for 'managed_node2' 11416 1726776703.44013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11416 1726776703.44021: variable 'omit' from source: magic vars 11416 1726776703.44100: variable 'omit' from source: magic vars 11416 1726776703.44140: variable 'omit' from source: magic vars 11416 1726776703.44454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11416 1726776703.45991: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11416 1726776703.46038: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11416 1726776703.46068: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11416 1726776703.46343: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11416 1726776703.46368: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11416 1726776703.46421: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11416 1726776703.46444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11416 1726776703.46462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11416 1726776703.46493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11416 1726776703.46504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11416 1726776703.46538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11416 1726776703.46555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11416 1726776703.46576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11416 1726776703.46602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 
11416 1726776703.46617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11416 1726776703.46660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11416 1726776703.46680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11416 1726776703.46698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11416 1726776703.46723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11416 1726776703.46736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11416 1726776703.46884: variable '__kernel_settings_find_profile_dirs' from source: set_fact 11416 1726776703.46949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11416 1726776703.47059: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11416 1726776703.47089: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11416 1726776703.47112: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11416 1726776703.47135: Loading TestModule 
'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11416 1726776703.47164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11416 1726776703.47182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11416 1726776703.47200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11416 1726776703.47217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11416 1726776703.47257: variable 'omit' from source: magic vars 11416 1726776703.47280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11416 1726776703.47300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11416 1726776703.47315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11416 1726776703.47330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11416 1726776703.47340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11416 1726776703.47362: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11416 1726776703.47369: variable 'ansible_host' from source: host vars for 'managed_node2' 11416 1726776703.47374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11416 
1726776703.47441: Set connection var ansible_connection to ssh 11416 1726776703.47449: Set connection var ansible_pipelining to False 11416 1726776703.47456: Set connection var ansible_timeout to 10 11416 1726776703.47463: Set connection var ansible_module_compression to ZIP_DEFLATED 11416 1726776703.47470: Set connection var ansible_shell_type to sh 11416 1726776703.47476: Set connection var ansible_shell_executable to /bin/sh 11416 1726776703.47493: variable 'ansible_shell_executable' from source: unknown 11416 1726776703.47497: variable 'ansible_connection' from source: unknown 11416 1726776703.47500: variable 'ansible_module_compression' from source: unknown 11416 1726776703.47503: variable 'ansible_shell_type' from source: unknown 11416 1726776703.47507: variable 'ansible_shell_executable' from source: unknown 11416 1726776703.47510: variable 'ansible_host' from source: host vars for 'managed_node2' 11416 1726776703.47514: variable 'ansible_pipelining' from source: unknown 11416 1726776703.47517: variable 'ansible_timeout' from source: unknown 11416 1726776703.47522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11416 1726776703.47588: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11416 1726776703.47599: variable 'omit' from source: magic vars 11416 1726776703.47605: starting attempt loop 11416 1726776703.47608: running the handler 11416 1726776703.47618: handler run complete 11416 1726776703.47626: attempt loop complete, returning result 11416 1726776703.47630: _execute() done 11416 1726776703.47634: dumping result to json 11416 1726776703.47638: done dumping result, returning 11416 1726776703.47645: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [120fa90a-8a95-cec2-986e-000000000edb] 11416 1726776703.47650: sending task result for task 120fa90a-8a95-cec2-986e-000000000edb 11416 1726776703.47673: done sending task result for task 120fa90a-8a95-cec2-986e-000000000edb 11416 1726776703.47676: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8218 1726776703.47818: no more pending results, returning what we have 8218 1726776703.47822: results queue empty 8218 1726776703.47823: checking for any_errors_fatal 8218 1726776703.47838: done checking for any_errors_fatal 8218 1726776703.47838: checking for max_fail_percentage 8218 1726776703.47840: done checking for max_fail_percentage 8218 1726776703.47840: checking to see if all hosts have failed and the running result is not ok 8218 1726776703.47841: done checking to see if all hosts have failed 8218 1726776703.47842: getting the remaining hosts for this loop 8218 1726776703.47843: done getting the remaining hosts for this loop 8218 1726776703.47845: getting the next task for host managed_node2 8218 1726776703.47851: done getting next task for host managed_node2 8218 1726776703.47855: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8218 1726776703.47858: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776703.47868: getting variables 8218 1726776703.47869: in VariableManager get_vars() 8218 1726776703.47902: Calling all_inventory to load vars for managed_node2 8218 1726776703.47905: Calling groups_inventory to load vars for managed_node2 8218 1726776703.47907: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776703.47915: Calling all_plugins_play to load vars for managed_node2 8218 1726776703.47917: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776703.47919: Calling groups_plugins_play to load vars for managed_node2 8218 1726776703.48031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776703.48179: done with get_vars() 8218 1726776703.48186: done getting variables 8218 1726776703.48227: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 16:11:43 -0400 (0:00:00.047) 0:01:29.313 **** 8218 1726776703.48252: entering _queue_task() for managed_node2/service 8218 1726776703.48411: worker is 1 (out of 1 available) 8218 1726776703.48426: exiting _queue_task() for managed_node2/service 8218 1726776703.48439: done queuing things up, now waiting for results queue to drain 8218 1726776703.48440: waiting for 
pending results... 11417 1726776703.48572: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11417 1726776703.48691: in run() - task 120fa90a-8a95-cec2-986e-000000000edc 11417 1726776703.48707: variable 'ansible_search_path' from source: unknown 11417 1726776703.48711: variable 'ansible_search_path' from source: unknown 11417 1726776703.48744: variable '__kernel_settings_services' from source: include_vars 11417 1726776703.48983: variable '__kernel_settings_services' from source: include_vars 11417 1726776703.49041: variable 'omit' from source: magic vars 11417 1726776703.49127: variable 'ansible_host' from source: host vars for 'managed_node2' 11417 1726776703.49142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11417 1726776703.49151: variable 'omit' from source: magic vars 11417 1726776703.49204: variable 'omit' from source: magic vars 11417 1726776703.49241: variable 'omit' from source: magic vars 11417 1726776703.49274: variable 'item' from source: unknown 11417 1726776703.49333: variable 'item' from source: unknown 11417 1726776703.49352: variable 'omit' from source: magic vars 11417 1726776703.49383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11417 1726776703.49408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11417 1726776703.49427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11417 1726776703.49442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11417 1726776703.49453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11417 1726776703.49475: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11417 1726776703.49481: variable 'ansible_host' from source: host vars for 'managed_node2' 11417 1726776703.49485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11417 1726776703.49551: Set connection var ansible_connection to ssh 11417 1726776703.49559: Set connection var ansible_pipelining to False 11417 1726776703.49565: Set connection var ansible_timeout to 10 11417 1726776703.49573: Set connection var ansible_module_compression to ZIP_DEFLATED 11417 1726776703.49578: Set connection var ansible_shell_type to sh 11417 1726776703.49584: Set connection var ansible_shell_executable to /bin/sh 11417 1726776703.49597: variable 'ansible_shell_executable' from source: unknown 11417 1726776703.49601: variable 'ansible_connection' from source: unknown 11417 1726776703.49605: variable 'ansible_module_compression' from source: unknown 11417 1726776703.49609: variable 'ansible_shell_type' from source: unknown 11417 1726776703.49612: variable 'ansible_shell_executable' from source: unknown 11417 1726776703.49616: variable 'ansible_host' from source: host vars for 'managed_node2' 11417 1726776703.49620: variable 'ansible_pipelining' from source: unknown 11417 1726776703.49623: variable 'ansible_timeout' from source: unknown 11417 1726776703.49627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11417 1726776703.49714: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11417 1726776703.49726: variable 'omit' from source: magic vars 11417 1726776703.49733: starting attempt loop 11417 1726776703.49737: running the handler 11417 1726776703.49799: variable 'ansible_facts' from source: unknown 11417 1726776703.49881: _low_level_execute_command(): starting 
11417 1726776703.49890: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11417 1726776703.52239: stdout chunk (state=2): >>>/root <<< 11417 1726776703.52359: stderr chunk (state=3): >>><<< 11417 1726776703.52365: stdout chunk (state=3): >>><<< 11417 1726776703.52382: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11417 1726776703.52395: _low_level_execute_command(): starting 11417 1726776703.52400: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463 `" && echo ansible-tmp-1726776703.523899-11417-24572932170463="` echo /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463 `" ) && sleep 0' 11417 1726776703.55214: stdout chunk (state=2): >>>ansible-tmp-1726776703.523899-11417-24572932170463=/root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463 <<< 11417 1726776703.55340: stderr chunk (state=3): >>><<< 11417 1726776703.55346: stdout chunk (state=3): >>><<< 11417 1726776703.55359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776703.523899-11417-24572932170463=/root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463 , stderr= 11417 1726776703.55382: variable 'ansible_module_compression' from source: unknown 11417 1726776703.55424: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11417 1726776703.55472: variable 'ansible_facts' from source: unknown 11417 1726776703.55625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/AnsiballZ_systemd.py 11417 1726776703.55723: Sending initial data 11417 1726776703.55732: Sent initial data (153 bytes) 11417 1726776703.58194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpfeeahtkh 
/root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/AnsiballZ_systemd.py <<< 11417 1726776703.60119: stderr chunk (state=3): >>><<< 11417 1726776703.60126: stdout chunk (state=3): >>><<< 11417 1726776703.60151: done transferring module to remote 11417 1726776703.60162: _low_level_execute_command(): starting 11417 1726776703.60169: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/ /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/AnsiballZ_systemd.py && sleep 0' 11417 1726776703.62511: stderr chunk (state=2): >>><<< 11417 1726776703.62518: stdout chunk (state=2): >>><<< 11417 1726776703.62534: _low_level_execute_command() done: rc=0, stdout=, stderr= 11417 1726776703.62538: _low_level_execute_command(): starting 11417 1726776703.62543: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/AnsiballZ_systemd.py && sleep 0' 11417 1726776703.90353: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "22953984", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11417 1726776703.90395: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", 
"LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": 
"infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11417 1726776703.91985: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11417 1726776703.92033: stderr chunk (state=3): >>><<< 11417 1726776703.92039: stdout chunk (state=3): >>><<< 11417 1726776703.92059: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "22953984", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", 
"StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11417 1726776703.92189: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11417 1726776703.92209: _low_level_execute_command(): starting 11417 1726776703.92215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776703.523899-11417-24572932170463/ > /dev/null 2>&1 && sleep 0' 11417 1726776703.94593: stderr chunk (state=2): >>><<< 11417 1726776703.94600: stdout chunk (state=2): >>><<< 11417 1726776703.94614: _low_level_execute_command() done: rc=0, stdout=, stderr= 11417 1726776703.94622: handler run complete 11417 1726776703.94657: attempt loop 
complete, returning result 11417 1726776703.94675: variable 'item' from source: unknown 11417 1726776703.94737: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon 
cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "659", "MemoryAccounting": "yes", "MemoryCurrent": "22953984", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": 
"sysinit.target dbus.socket system.slice dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "WatchdogUSec": "0" } } 11417 1726776703.94830: dumping result to json 11417 1726776703.94848: done dumping result, returning 11417 1726776703.94856: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-cec2-986e-000000000edc] 11417 1726776703.94863: sending task result for task 
120fa90a-8a95-cec2-986e-000000000edc 11417 1726776703.94970: done sending task result for task 120fa90a-8a95-cec2-986e-000000000edc 11417 1726776703.94974: WORKER PROCESS EXITING 8218 1726776703.95312: no more pending results, returning what we have 8218 1726776703.95314: results queue empty 8218 1726776703.95315: checking for any_errors_fatal 8218 1726776703.95318: done checking for any_errors_fatal 8218 1726776703.95318: checking for max_fail_percentage 8218 1726776703.95320: done checking for max_fail_percentage 8218 1726776703.95320: checking to see if all hosts have failed and the running result is not ok 8218 1726776703.95321: done checking to see if all hosts have failed 8218 1726776703.95321: getting the remaining hosts for this loop 8218 1726776703.95322: done getting the remaining hosts for this loop 8218 1726776703.95324: getting the next task for host managed_node2 8218 1726776703.95330: done getting next task for host managed_node2 8218 1726776703.95333: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8218 1726776703.95335: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776703.95342: getting variables 8218 1726776703.95343: in VariableManager get_vars() 8218 1726776703.95366: Calling all_inventory to load vars for managed_node2 8218 1726776703.95368: Calling groups_inventory to load vars for managed_node2 8218 1726776703.95369: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776703.95375: Calling all_plugins_play to load vars for managed_node2 8218 1726776703.95377: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776703.95379: Calling groups_plugins_play to load vars for managed_node2 8218 1726776703.95493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776703.95609: done with get_vars() 8218 1726776703.95618: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:11:43 -0400 (0:00:00.474) 0:01:29.787 **** 8218 1726776703.95686: entering _queue_task() for managed_node2/file 8218 1726776703.95846: worker is 1 (out of 1 available) 8218 1726776703.95860: exiting _queue_task() for managed_node2/file 8218 1726776703.95871: done queuing things up, now waiting for results queue to drain 8218 1726776703.95874: waiting for pending results... 
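The systemd task result logged above (item=tuned, state=started, enabled=true, scope=system) corresponds to a role task along these lines. This is a hedged sketch reconstructed from the logged `module_args` and the TASK banner, not the actual source of the fedora.linux_system_roles.kernel_settings role; the loop list is an assumption based on the single logged item.

```
# Sketch reconstructed from the logged module_args; the loop list
# is assumed (only "tuned" appears as an item in this log).
- name: Ensure required services are enabled and started
  ansible.builtin.systemd:
    name: "{{ item }}"
    state: started
    enabled: true
  loop:
    - tuned
```

The `changed: false` in the result indicates tuned.service was already enabled and running, so the module reported ok without modifying the unit.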
11425 1726776703.96001: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11425 1726776703.96121: in run() - task 120fa90a-8a95-cec2-986e-000000000edd 11425 1726776703.96140: variable 'ansible_search_path' from source: unknown 11425 1726776703.96144: variable 'ansible_search_path' from source: unknown 11425 1726776703.96173: calling self._execute() 11425 1726776703.96260: variable 'ansible_host' from source: host vars for 'managed_node2' 11425 1726776703.96272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11425 1726776703.96283: variable 'omit' from source: magic vars 11425 1726776703.96386: variable 'omit' from source: magic vars 11425 1726776703.96443: variable 'omit' from source: magic vars 11425 1726776703.96472: variable '__kernel_settings_profile_dir' from source: role '' all vars 11425 1726776703.96759: variable '__kernel_settings_profile_dir' from source: role '' all vars 11425 1726776703.96879: variable '__kernel_settings_profile_parent' from source: set_fact 11425 1726776703.96889: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11425 1726776703.96930: variable 'omit' from source: magic vars 11425 1726776703.96970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11425 1726776703.97001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11425 1726776703.97024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11425 1726776703.97043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11425 1726776703.97054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11425 1726776703.97085: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11425 1726776703.97092: variable 'ansible_host' from source: host vars for 'managed_node2' 11425 1726776703.97096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11425 1726776703.97193: Set connection var ansible_connection to ssh 11425 1726776703.97203: Set connection var ansible_pipelining to False 11425 1726776703.97210: Set connection var ansible_timeout to 10 11425 1726776703.97218: Set connection var ansible_module_compression to ZIP_DEFLATED 11425 1726776703.97223: Set connection var ansible_shell_type to sh 11425 1726776703.97227: Set connection var ansible_shell_executable to /bin/sh 11425 1726776703.97247: variable 'ansible_shell_executable' from source: unknown 11425 1726776703.97251: variable 'ansible_connection' from source: unknown 11425 1726776703.97255: variable 'ansible_module_compression' from source: unknown 11425 1726776703.97258: variable 'ansible_shell_type' from source: unknown 11425 1726776703.97261: variable 'ansible_shell_executable' from source: unknown 11425 1726776703.97265: variable 'ansible_host' from source: host vars for 'managed_node2' 11425 1726776703.97271: variable 'ansible_pipelining' from source: unknown 11425 1726776703.97275: variable 'ansible_timeout' from source: unknown 11425 1726776703.97278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11425 1726776703.97409: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11425 1726776703.97419: variable 'omit' from source: magic vars 11425 1726776703.97424: starting attempt loop 11425 1726776703.97426: running the handler 11425 1726776703.97439: _low_level_execute_command(): starting 11425 1726776703.97447: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 11425 1726776703.99735: stdout chunk (state=2): >>>/root <<< 11425 1726776703.99850: stderr chunk (state=3): >>><<< 11425 1726776703.99856: stdout chunk (state=3): >>><<< 11425 1726776703.99873: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11425 1726776703.99885: _low_level_execute_command(): starting 11425 1726776703.99891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893 `" && echo ansible-tmp-1726776703.9988077-11425-50946755817893="` echo /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893 `" ) && sleep 0' 11425 1726776704.02406: stdout chunk (state=2): >>>ansible-tmp-1726776703.9988077-11425-50946755817893=/root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893 <<< 11425 1726776704.02532: stderr chunk (state=3): >>><<< 11425 1726776704.02538: stdout chunk (state=3): >>><<< 11425 1726776704.02551: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776703.9988077-11425-50946755817893=/root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893 , stderr= 11425 1726776704.02583: variable 'ansible_module_compression' from source: unknown 11425 1726776704.02623: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11425 1726776704.02656: variable 'ansible_facts' from source: unknown 11425 1726776704.02725: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/AnsiballZ_file.py 11425 1726776704.02820: Sending initial data 11425 1726776704.02826: Sent initial data (151 bytes) 11425 1726776704.05278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpdg6fc_j8 /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/AnsiballZ_file.py <<< 11425 
1726776704.06351: stderr chunk (state=3): >>><<< 11425 1726776704.06357: stdout chunk (state=3): >>><<< 11425 1726776704.06375: done transferring module to remote 11425 1726776704.06385: _low_level_execute_command(): starting 11425 1726776704.06390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/ /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/AnsiballZ_file.py && sleep 0' 11425 1726776704.08684: stderr chunk (state=2): >>><<< 11425 1726776704.08691: stdout chunk (state=2): >>><<< 11425 1726776704.08703: _low_level_execute_command() done: rc=0, stdout=, stderr= 11425 1726776704.08707: _low_level_execute_command(): starting 11425 1726776704.08712: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/AnsiballZ_file.py && sleep 0' 11425 1726776704.25148: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11425 1726776704.26283: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11425 1726776704.26333: stderr chunk (state=3): >>><<< 11425 1726776704.26340: stdout chunk (state=3): >>><<< 11425 1726776704.26358: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
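Annotation: the JSON blob above is exactly what the `file` module printed on stdout on the managed node; the controller parses it to build the task result. A minimal sketch of that interpretation step, using an abridged copy of the payload from this log (keys trimmed for brevity):

```python
import json

# Abridged copy of the stdout the file module produced above.
stdout = (
    '{"path": "/etc/tuned/kernel_settings", "changed": false, '
    '"state": "directory", "mode": "0755", "owner": "root"}'
)

result = json.loads(stdout)

# changed == False means the directory already existed with the requested
# state/mode, so the task is reported as "ok" rather than "changed".
assert result["changed"] is False
assert result["state"] == "directory"
```

This is why the task banner further down shows `ok: [managed_node2]` with `"changed": false`.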
11425 1726776704.26390: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11425 1726776704.26402: _low_level_execute_command(): starting 11425 1726776704.26407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776703.9988077-11425-50946755817893/ > /dev/null 2>&1 && sleep 0' 11425 1726776704.28806: stderr chunk (state=2): >>><<< 11425 1726776704.28813: stdout chunk (state=2): >>><<< 11425 1726776704.28827: _low_level_execute_command() done: rc=0, stdout=, stderr= 11425 1726776704.28836: handler run complete 11425 1726776704.28854: attempt loop complete, returning result 11425 1726776704.28858: _execute() done 11425 1726776704.28861: dumping result to json 11425 1726776704.28866: done dumping result, returning 11425 1726776704.28874: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-cec2-986e-000000000edd] 11425 1726776704.28880: sending task result for task 120fa90a-8a95-cec2-986e-000000000edd 11425 1726776704.28912: done sending task result for task 120fa90a-8a95-cec2-986e-000000000edd 11425 1726776704.28916: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": 
"/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8218 1726776704.29075: no more pending results, returning what we have 8218 1726776704.29078: results queue empty 8218 1726776704.29079: checking for any_errors_fatal 8218 1726776704.29097: done checking for any_errors_fatal 8218 1726776704.29098: checking for max_fail_percentage 8218 1726776704.29099: done checking for max_fail_percentage 8218 1726776704.29100: checking to see if all hosts have failed and the running result is not ok 8218 1726776704.29101: done checking to see if all hosts have failed 8218 1726776704.29101: getting the remaining hosts for this loop 8218 1726776704.29102: done getting the remaining hosts for this loop 8218 1726776704.29105: getting the next task for host managed_node2 8218 1726776704.29111: done getting next task for host managed_node2 8218 1726776704.29114: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8218 1726776704.29118: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776704.29130: getting variables 8218 1726776704.29132: in VariableManager get_vars() 8218 1726776704.29168: Calling all_inventory to load vars for managed_node2 8218 1726776704.29171: Calling groups_inventory to load vars for managed_node2 8218 1726776704.29173: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776704.29180: Calling all_plugins_play to load vars for managed_node2 8218 1726776704.29181: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776704.29183: Calling groups_plugins_play to load vars for managed_node2 8218 1726776704.29288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776704.29409: done with get_vars() 8218 1726776704.29418: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:11:44 -0400 (0:00:00.338) 0:01:30.125 **** 8218 1726776704.29490: entering _queue_task() for managed_node2/slurp 8218 1726776704.29647: worker is 1 (out of 1 available) 8218 1726776704.29660: exiting _queue_task() for managed_node2/slurp 8218 1726776704.29675: done queuing things up, now waiting for results queue to drain 8218 1726776704.29677: waiting for pending results... 
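Annotation: each remote module run in this log follows the same lifecycle — create a uniquely named tmpdir, sftp the AnsiballZ payload into it, chmod it, execute it, then `rm -rf` the tmpdir. The tmpdir names seen above (e.g. `ansible-tmp-1726776704.3338428-11436-227307433901962`) follow an `ansible-tmp-<epoch>-<pid>-<random>` pattern; the sketch below mimics that naming. It is illustrative only — ansible-core builds this path inside its shell plugin, and the random-suffix width here is an assumption:

```python
import os
import random
import time


def make_tmp_name(pid=None):
    """Mimic the ansible-tmp-<epoch>-<pid>-<random> directory names in the log.

    Illustrative sketch: the exact random range used by ansible-core may differ.
    """
    pid = pid if pid is not None else os.getpid()
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2 ** 48))


name = make_tmp_name(pid=11436)
# e.g. 'ansible-tmp-1726776704.3338428-11436-<random>'
```

The unpredictable suffix (plus `umask 77` in the `mkdir` command shown above) keeps per-task payloads private and collision-free when several workers run at once.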
11436 1726776704.29803: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11436 1726776704.29924: in run() - task 120fa90a-8a95-cec2-986e-000000000ede 11436 1726776704.29943: variable 'ansible_search_path' from source: unknown 11436 1726776704.29947: variable 'ansible_search_path' from source: unknown 11436 1726776704.29975: calling self._execute() 11436 1726776704.30045: variable 'ansible_host' from source: host vars for 'managed_node2' 11436 1726776704.30055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11436 1726776704.30064: variable 'omit' from source: magic vars 11436 1726776704.30141: variable 'omit' from source: magic vars 11436 1726776704.30181: variable 'omit' from source: magic vars 11436 1726776704.30201: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11436 1726776704.30410: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11436 1726776704.30471: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11436 1726776704.30498: variable 'omit' from source: magic vars 11436 1726776704.30532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11436 1726776704.30557: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11436 1726776704.30576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11436 1726776704.30589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11436 1726776704.30600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11436 1726776704.30623: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11436 1726776704.30629: variable 'ansible_host' from 
source: host vars for 'managed_node2' 11436 1726776704.30634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11436 1726776704.30698: Set connection var ansible_connection to ssh 11436 1726776704.30706: Set connection var ansible_pipelining to False 11436 1726776704.30713: Set connection var ansible_timeout to 10 11436 1726776704.30720: Set connection var ansible_module_compression to ZIP_DEFLATED 11436 1726776704.30725: Set connection var ansible_shell_type to sh 11436 1726776704.30732: Set connection var ansible_shell_executable to /bin/sh 11436 1726776704.30746: variable 'ansible_shell_executable' from source: unknown 11436 1726776704.30750: variable 'ansible_connection' from source: unknown 11436 1726776704.30753: variable 'ansible_module_compression' from source: unknown 11436 1726776704.30756: variable 'ansible_shell_type' from source: unknown 11436 1726776704.30760: variable 'ansible_shell_executable' from source: unknown 11436 1726776704.30763: variable 'ansible_host' from source: host vars for 'managed_node2' 11436 1726776704.30767: variable 'ansible_pipelining' from source: unknown 11436 1726776704.30771: variable 'ansible_timeout' from source: unknown 11436 1726776704.30774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11436 1726776704.30912: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11436 1726776704.30923: variable 'omit' from source: magic vars 11436 1726776704.30930: starting attempt loop 11436 1726776704.30934: running the handler 11436 1726776704.30946: _low_level_execute_command(): starting 11436 1726776704.30953: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11436 1726776704.33230: stdout chunk (state=2): >>>/root <<< 11436 1726776704.33349: 
stderr chunk (state=3): >>><<< 11436 1726776704.33356: stdout chunk (state=3): >>><<< 11436 1726776704.33376: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11436 1726776704.33389: _low_level_execute_command(): starting 11436 1726776704.33396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962 `" && echo ansible-tmp-1726776704.3338428-11436-227307433901962="` echo /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962 `" ) && sleep 0' 11436 1726776704.35987: stdout chunk (state=2): >>>ansible-tmp-1726776704.3338428-11436-227307433901962=/root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962 <<< 11436 1726776704.36118: stderr chunk (state=3): >>><<< 11436 1726776704.36125: stdout chunk (state=3): >>><<< 11436 1726776704.36143: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776704.3338428-11436-227307433901962=/root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962 , stderr= 11436 1726776704.36182: variable 'ansible_module_compression' from source: unknown 11436 1726776704.36217: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11436 1726776704.36251: variable 'ansible_facts' from source: unknown 11436 1726776704.36323: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/AnsiballZ_slurp.py 11436 1726776704.36859: Sending initial data 11436 1726776704.36870: Sent initial data (153 bytes) 11436 1726776704.39084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp4cjk5d1k /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/AnsiballZ_slurp.py <<< 11436 1726776704.40122: stderr chunk (state=3): >>><<< 11436 1726776704.40130: stdout chunk (state=3): >>><<< 11436 
1726776704.40148: done transferring module to remote 11436 1726776704.40159: _low_level_execute_command(): starting 11436 1726776704.40164: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/ /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/AnsiballZ_slurp.py && sleep 0' 11436 1726776704.42532: stderr chunk (state=2): >>><<< 11436 1726776704.42539: stdout chunk (state=2): >>><<< 11436 1726776704.42551: _low_level_execute_command() done: rc=0, stdout=, stderr= 11436 1726776704.42555: _low_level_execute_command(): starting 11436 1726776704.42560: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/AnsiballZ_slurp.py && sleep 0' 11436 1726776704.57559: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11436 1726776704.58557: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11436 1726776704.58605: stderr chunk (state=3): >>><<< 11436 1726776704.58611: stdout chunk (state=3): >>><<< 11436 1726776704.58630: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.12.75 closed. 
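Annotation: the `slurp` module returns file contents base64-encoded so arbitrary bytes survive the JSON/SSH round trip. Decoding the `content` field captured above recovers the managed node's active tuned profile:

```python
import base64

# "content" field from the slurp result in the log above
# (source: /etc/tuned/active_profile on managed_node2).
content_b64 = "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK"

decoded = base64.b64decode(content_b64).decode("utf-8")
# -> 'virtual-guest kernel_settings\n'
```

This matches the `__kernel_settings_active_profile` fact set by the next task, minus the trailing newline.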
11436 1726776704.58654: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11436 1726776704.58665: _low_level_execute_command(): starting 11436 1726776704.58674: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776704.3338428-11436-227307433901962/ > /dev/null 2>&1 && sleep 0' 11436 1726776704.61077: stderr chunk (state=2): >>><<< 11436 1726776704.61084: stdout chunk (state=2): >>><<< 11436 1726776704.61097: _low_level_execute_command() done: rc=0, stdout=, stderr= 11436 1726776704.61104: handler run complete 11436 1726776704.61117: attempt loop complete, returning result 11436 1726776704.61121: _execute() done 11436 1726776704.61124: dumping result to json 11436 1726776704.61130: done dumping result, returning 11436 1726776704.61137: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-cec2-986e-000000000ede] 11436 1726776704.61143: sending task result for task 120fa90a-8a95-cec2-986e-000000000ede 11436 1726776704.61172: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ede 11436 1726776704.61176: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8218 1726776704.61311: no more pending 
results, returning what we have 8218 1726776704.61315: results queue empty 8218 1726776704.61316: checking for any_errors_fatal 8218 1726776704.61327: done checking for any_errors_fatal 8218 1726776704.61328: checking for max_fail_percentage 8218 1726776704.61331: done checking for max_fail_percentage 8218 1726776704.61331: checking to see if all hosts have failed and the running result is not ok 8218 1726776704.61332: done checking to see if all hosts have failed 8218 1726776704.61333: getting the remaining hosts for this loop 8218 1726776704.61334: done getting the remaining hosts for this loop 8218 1726776704.61337: getting the next task for host managed_node2 8218 1726776704.61343: done getting next task for host managed_node2 8218 1726776704.61346: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8218 1726776704.61350: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776704.61360: getting variables 8218 1726776704.61361: in VariableManager get_vars() 8218 1726776704.61398: Calling all_inventory to load vars for managed_node2 8218 1726776704.61401: Calling groups_inventory to load vars for managed_node2 8218 1726776704.61403: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776704.61411: Calling all_plugins_play to load vars for managed_node2 8218 1726776704.61413: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776704.61415: Calling groups_plugins_play to load vars for managed_node2 8218 1726776704.61532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776704.61652: done with get_vars() 8218 1726776704.61661: done getting variables 8218 1726776704.61704: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:11:44 -0400 (0:00:00.322) 0:01:30.447 **** 8218 1726776704.61727: entering _queue_task() for managed_node2/set_fact 8218 1726776704.61889: worker is 1 (out of 1 available) 8218 1726776704.61904: exiting _queue_task() for managed_node2/set_fact 8218 1726776704.61916: done queuing things up, now waiting for results queue to drain 8218 1726776704.61918: waiting for pending results... 
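Annotation: the `Set active_profile` set_fact below yields `"virtual-guest kernel_settings"` — the slurped profile already contains `kernel_settings`, so nothing is appended. The role's actual logic is a Jinja2 expression that the -vvvv log does not print; the following is a hypothetical Python rendering of what it appears to compute (ensure `kernel_settings` is among the active profiles):

```python
def active_profile(current, extra="kernel_settings"):
    """Hypothetical sketch of the role's Set active_profile logic.

    Assumption: the role appends `extra` to the whitespace-separated
    profile list only when it is not already present.
    """
    profiles = current.split()
    if extra not in profiles:
        profiles.append(extra)
    return " ".join(profiles)


# With the profile slurped above, the fact is unchanged:
fact = active_profile("virtual-guest kernel_settings")
```

This explains why the upcoming `Ensure kernel_settings is in active_profile` copy task can compare the rendered content against `/etc/tuned/active_profile` and report no change.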
11444 1726776704.62042: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11444 1726776704.62157: in run() - task 120fa90a-8a95-cec2-986e-000000000edf 11444 1726776704.62175: variable 'ansible_search_path' from source: unknown 11444 1726776704.62179: variable 'ansible_search_path' from source: unknown 11444 1726776704.62205: calling self._execute() 11444 1726776704.62616: variable 'ansible_host' from source: host vars for 'managed_node2' 11444 1726776704.62627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11444 1726776704.62638: variable 'omit' from source: magic vars 11444 1726776704.62708: variable 'omit' from source: magic vars 11444 1726776704.62751: variable 'omit' from source: magic vars 11444 1726776704.63017: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11444 1726776704.63027: variable '__cur_profile' from source: task vars 11444 1726776704.63133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11444 1726776704.64589: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11444 1726776704.64646: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11444 1726776704.64675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11444 1726776704.64701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11444 1726776704.64721: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11444 1726776704.64776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11444 
1726776704.64797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11444 1726776704.64815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11444 1726776704.64844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11444 1726776704.64856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11444 1726776704.64932: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11444 1726776704.64974: variable 'omit' from source: magic vars 11444 1726776704.64995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11444 1726776704.65015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11444 1726776704.65033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11444 1726776704.65047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11444 1726776704.65057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11444 1726776704.65080: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11444 1726776704.65086: variable 'ansible_host' from source: host vars for 'managed_node2' 11444 1726776704.65091: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11444 1726776704.65162: Set connection var ansible_connection to ssh 11444 1726776704.65170: Set connection var ansible_pipelining to False 11444 1726776704.65177: Set connection var ansible_timeout to 10 11444 1726776704.65185: Set connection var ansible_module_compression to ZIP_DEFLATED 11444 1726776704.65190: Set connection var ansible_shell_type to sh 11444 1726776704.65195: Set connection var ansible_shell_executable to /bin/sh 11444 1726776704.65212: variable 'ansible_shell_executable' from source: unknown 11444 1726776704.65215: variable 'ansible_connection' from source: unknown 11444 1726776704.65219: variable 'ansible_module_compression' from source: unknown 11444 1726776704.65222: variable 'ansible_shell_type' from source: unknown 11444 1726776704.65225: variable 'ansible_shell_executable' from source: unknown 11444 1726776704.65231: variable 'ansible_host' from source: host vars for 'managed_node2' 11444 1726776704.65235: variable 'ansible_pipelining' from source: unknown 11444 1726776704.65238: variable 'ansible_timeout' from source: unknown 11444 1726776704.65243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11444 1726776704.65305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11444 1726776704.65316: variable 'omit' from source: magic vars 11444 1726776704.65321: starting attempt loop 11444 1726776704.65325: running the handler 11444 1726776704.65335: handler run complete 11444 1726776704.65343: attempt loop complete, returning result 11444 1726776704.65346: _execute() done 11444 1726776704.65349: dumping result to json 11444 1726776704.65353: done dumping result, returning 11444 1726776704.65359: done running TaskExecutor() 
for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-cec2-986e-000000000edf] 11444 1726776704.65365: sending task result for task 120fa90a-8a95-cec2-986e-000000000edf 11444 1726776704.65385: done sending task result for task 120fa90a-8a95-cec2-986e-000000000edf 11444 1726776704.65388: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8218 1726776704.65823: no more pending results, returning what we have 8218 1726776704.65825: results queue empty 8218 1726776704.65825: checking for any_errors_fatal 8218 1726776704.65830: done checking for any_errors_fatal 8218 1726776704.65830: checking for max_fail_percentage 8218 1726776704.65831: done checking for max_fail_percentage 8218 1726776704.65832: checking to see if all hosts have failed and the running result is not ok 8218 1726776704.65832: done checking to see if all hosts have failed 8218 1726776704.65833: getting the remaining hosts for this loop 8218 1726776704.65833: done getting the remaining hosts for this loop 8218 1726776704.65835: getting the next task for host managed_node2 8218 1726776704.65839: done getting next task for host managed_node2 8218 1726776704.65842: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8218 1726776704.65844: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776704.65855: getting variables 8218 1726776704.65856: in VariableManager get_vars() 8218 1726776704.65878: Calling all_inventory to load vars for managed_node2 8218 1726776704.65879: Calling groups_inventory to load vars for managed_node2 8218 1726776704.65881: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776704.65886: Calling all_plugins_play to load vars for managed_node2 8218 1726776704.65888: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776704.65890: Calling groups_plugins_play to load vars for managed_node2 8218 1726776704.65979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776704.66091: done with get_vars() 8218 1726776704.66098: done getting variables 8218 1726776704.66139: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 16:11:44 -0400 (0:00:00.044) 0:01:30.492 **** 8218 1726776704.66160: entering _queue_task() for managed_node2/copy 8218 1726776704.66320: worker is 1 (out of 1 available) 8218 1726776704.66336: 
exiting _queue_task() for managed_node2/copy 8218 1726776704.66348: done queuing things up, now waiting for results queue to drain 8218 1726776704.66350: waiting for pending results... 11445 1726776704.66480: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11445 1726776704.66618: in run() - task 120fa90a-8a95-cec2-986e-000000000ee0 11445 1726776704.66636: variable 'ansible_search_path' from source: unknown 11445 1726776704.66640: variable 'ansible_search_path' from source: unknown 11445 1726776704.66667: calling self._execute() 11445 1726776704.66737: variable 'ansible_host' from source: host vars for 'managed_node2' 11445 1726776704.66745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11445 1726776704.66754: variable 'omit' from source: magic vars 11445 1726776704.66828: variable 'omit' from source: magic vars 11445 1726776704.66872: variable 'omit' from source: magic vars 11445 1726776704.66895: variable '__kernel_settings_active_profile' from source: set_fact 11445 1726776704.67110: variable '__kernel_settings_active_profile' from source: set_fact 11445 1726776704.67134: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11445 1726776704.67188: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11445 1726776704.67242: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11445 1726776704.67266: variable 'omit' from source: magic vars 11445 1726776704.67299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11445 1726776704.67324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11445 1726776704.67345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11445 1726776704.67359: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11445 1726776704.67370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11445 1726776704.67392: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11445 1726776704.67398: variable 'ansible_host' from source: host vars for 'managed_node2' 11445 1726776704.67402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11445 1726776704.67471: Set connection var ansible_connection to ssh 11445 1726776704.67479: Set connection var ansible_pipelining to False 11445 1726776704.67485: Set connection var ansible_timeout to 10 11445 1726776704.67493: Set connection var ansible_module_compression to ZIP_DEFLATED 11445 1726776704.67498: Set connection var ansible_shell_type to sh 11445 1726776704.67503: Set connection var ansible_shell_executable to /bin/sh 11445 1726776704.67518: variable 'ansible_shell_executable' from source: unknown 11445 1726776704.67522: variable 'ansible_connection' from source: unknown 11445 1726776704.67525: variable 'ansible_module_compression' from source: unknown 11445 1726776704.67530: variable 'ansible_shell_type' from source: unknown 11445 1726776704.67533: variable 'ansible_shell_executable' from source: unknown 11445 1726776704.67537: variable 'ansible_host' from source: host vars for 'managed_node2' 11445 1726776704.67541: variable 'ansible_pipelining' from source: unknown 11445 1726776704.67544: variable 'ansible_timeout' from source: unknown 11445 1726776704.67548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11445 1726776704.67638: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11445 1726776704.67651: variable 'omit' from source: magic vars 11445 1726776704.67657: starting attempt loop 11445 1726776704.67661: running the handler 11445 1726776704.67671: _low_level_execute_command(): starting 11445 1726776704.67678: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11445 1726776704.70030: stdout chunk (state=2): >>>/root <<< 11445 1726776704.70149: stderr chunk (state=3): >>><<< 11445 1726776704.70156: stdout chunk (state=3): >>><<< 11445 1726776704.70174: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11445 1726776704.70187: _low_level_execute_command(): starting 11445 1726776704.70194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642 `" && echo ansible-tmp-1726776704.7018209-11445-140122967255642="` echo /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642 `" ) && sleep 0' 11445 1726776704.72752: stdout chunk (state=2): >>>ansible-tmp-1726776704.7018209-11445-140122967255642=/root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642 <<< 11445 1726776704.72877: stderr chunk (state=3): >>><<< 11445 1726776704.72883: stdout chunk (state=3): >>><<< 11445 1726776704.72897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776704.7018209-11445-140122967255642=/root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642 , stderr= 11445 1726776704.72966: variable 'ansible_module_compression' from source: unknown 11445 1726776704.73010: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11445 1726776704.73046: variable 'ansible_facts' from source: unknown 
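The `( umask 77 && mkdir -p ... && mkdir "ansible-tmp-<epoch>-<pid>-<nonce>" )` command above is how the worker provisions a private per-task remote temp directory (mode 0700) before shipping the AnsiballZ payload. A minimal Python sketch of the same pattern; this is illustrative only, not Ansible's actual implementation, and the local target directory and nonce are assumptions:

```python
import os
import tempfile
import time

# Illustrative sketch (not Ansible's code): create a private per-task
# scratch directory named like the ansible-tmp-<epoch>-<pid>-<nonce>
# directories in the log, using umask 077 so it is created mode 0700.
old_umask = os.umask(0o77)
try:
    name = "ansible-tmp-%s-%d-%s" % (time.time(), os.getpid(), "example")
    tmpdir = os.path.join(tempfile.gettempdir(), name)
    os.makedirs(tmpdir)
finally:
    os.umask(old_umask)  # restore the process umask for later operations
```

Because the directory is 0700 and lives under the connecting user's tmp, the uploaded `AnsiballZ_*.py` payloads are not readable by other users on the managed node.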
11445 1726776704.73111: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_stat.py 11445 1726776704.73193: Sending initial data 11445 1726776704.73201: Sent initial data (152 bytes) 11445 1726776704.75671: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpw5sor364 /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_stat.py <<< 11445 1726776704.76726: stderr chunk (state=3): >>><<< 11445 1726776704.76736: stdout chunk (state=3): >>><<< 11445 1726776704.76756: done transferring module to remote 11445 1726776704.76767: _low_level_execute_command(): starting 11445 1726776704.76773: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/ /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_stat.py && sleep 0' 11445 1726776704.79133: stderr chunk (state=2): >>><<< 11445 1726776704.79143: stdout chunk (state=2): >>><<< 11445 1726776704.79157: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726776704.79161: _low_level_execute_command(): starting 11445 1726776704.79167: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_stat.py && sleep 0' 11445 1726776704.95449: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776704.5736027, "mtime": 1726776696.492572, "ctime": 1726776696.492572, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": 
false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11445 1726776704.96581: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11445 1726776704.96631: stderr chunk (state=3): >>><<< 11445 1726776704.96638: stdout chunk (state=3): >>><<< 11445 1726776704.96656: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776704.5736027, "mtime": 1726776696.492572, "ctime": 1726776696.492572, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11445 1726776704.96699: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11445 1726776704.96739: variable 'ansible_module_compression' from source: unknown 11445 1726776704.96773: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11445 1726776704.96794: variable 'ansible_facts' from source: unknown 11445 1726776704.96858: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_file.py 11445 1726776704.96946: Sending initial data 11445 1726776704.96954: Sent initial data (152 bytes) 11445 1726776704.99746: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpaay6t3wc /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_file.py <<< 11445 1726776705.00864: stderr chunk (state=3): >>><<< 11445 1726776705.00878: stdout chunk (state=3): >>><<< 11445 1726776705.00899: done transferring module to remote 11445 1726776705.00909: _low_level_execute_command(): starting 11445 1726776705.00914: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/ 
/root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_file.py && sleep 0' 11445 1726776705.03323: stderr chunk (state=2): >>><<< 11445 1726776705.03334: stdout chunk (state=2): >>><<< 11445 1726776705.03349: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726776705.03354: _low_level_execute_command(): starting 11445 1726776705.03360: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/AnsiballZ_file.py && sleep 0' 11445 1726776705.19451: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpr59051z4", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11445 1726776705.20597: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11445 1726776705.20649: stderr chunk (state=3): >>><<< 11445 1726776705.20657: stdout chunk (state=3): >>><<< 11445 1726776705.20675: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpr59051z4", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
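The file-module run above returns `"changed": false` because the target already has the requested state and mode. A minimal sketch of that compare-before-chmod idempotence check; this is a simplification, since the real module also reconciles owner, group, and SELinux context:

```python
import os
import stat
import tempfile

def ensure_mode(path: str, want: int) -> bool:
    """chmod only when needed; return True when a change was made."""
    current = stat.S_IMODE(os.stat(path).st_mode)
    if current == want:
        return False          # already correct -> "changed": false
    os.chmod(path, want)
    return True

fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o600)

first = ensure_mode(path, 0o600)   # no-op: mode already matches
os.chmod(path, 0o644)
second = ensure_mode(path, 0o600)  # repairs the drift
os.unlink(path)
```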
11445 1726776705.20703: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpr59051z4', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11445 1726776705.20714: _low_level_execute_command(): starting 11445 1726776705.20720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776704.7018209-11445-140122967255642/ > /dev/null 2>&1 && sleep 0' 11445 1726776705.23146: stderr chunk (state=2): >>><<< 11445 1726776705.23157: stdout chunk (state=2): >>><<< 11445 1726776705.23175: _low_level_execute_command() done: rc=0, stdout=, stderr= 11445 1726776705.23185: handler run complete 11445 1726776705.23205: attempt loop complete, returning result 11445 1726776705.23209: _execute() done 11445 1726776705.23212: dumping result to json 11445 1726776705.23218: done dumping result, returning 11445 1726776705.23224: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-cec2-986e-000000000ee0] 11445 1726776705.23232: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee0 11445 1726776705.23267: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee0 11445 1726776705.23271: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8218 1726776705.23441: no more pending results, returning what we have 8218 1726776705.23444: results queue empty 8218 1726776705.23445: checking for any_errors_fatal 8218 1726776705.23454: done checking for any_errors_fatal 8218 1726776705.23454: checking for max_fail_percentage 8218 1726776705.23456: done checking for max_fail_percentage 8218 1726776705.23457: checking to see if all hosts have failed and the running result is not ok 8218 1726776705.23457: done checking to see if all hosts have failed 8218 1726776705.23458: getting the remaining hosts for this loop 8218 1726776705.23459: done getting the remaining hosts for this loop 8218 1726776705.23462: getting the next task for host managed_node2 8218 1726776705.23471: done getting next task for host managed_node2 8218 1726776705.23474: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8218 1726776705.23477: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False
8218 1726776705.23487: getting variables
8218 1726776705.23488: in VariableManager get_vars()
8218 1726776705.23520: Calling all_inventory to load vars for managed_node2
8218 1726776705.23523: Calling groups_inventory to load vars for managed_node2
8218 1726776705.23524: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776705.23534: Calling all_plugins_play to load vars for managed_node2
8218 1726776705.23537: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776705.23539: Calling groups_plugins_play to load vars for managed_node2
8218 1726776705.23656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776705.23815: done with get_vars()
8218 1726776705.23823: done getting variables
8218 1726776705.23870: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
Thursday 19 September 2024 16:11:45 -0400 (0:00:00.577) 0:01:31.069 ****
8218 1726776705.23895: entering _queue_task() for managed_node2/copy
8218 1726776705.24077: worker is 1 (out of 1 available)
8218 1726776705.24092: exiting _queue_task() for managed_node2/copy
8218 1726776705.24104: done queuing things up, now waiting for results queue to drain
8218 1726776705.24107: waiting for pending results...
11460 1726776705.24239: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11460 1726776705.24382: in run() - task 120fa90a-8a95-cec2-986e-000000000ee1 11460 1726776705.24399: variable 'ansible_search_path' from source: unknown 11460 1726776705.24403: variable 'ansible_search_path' from source: unknown 11460 1726776705.24433: calling self._execute() 11460 1726776705.24504: variable 'ansible_host' from source: host vars for 'managed_node2' 11460 1726776705.24514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11460 1726776705.24523: variable 'omit' from source: magic vars 11460 1726776705.24601: variable 'omit' from source: magic vars 11460 1726776705.24645: variable 'omit' from source: magic vars 11460 1726776705.24670: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11460 1726776705.24894: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11460 1726776705.24959: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11460 1726776705.24991: variable 'omit' from source: magic vars 11460 1726776705.25028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11460 1726776705.25056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11460 1726776705.25077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11460 1726776705.25092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11460 1726776705.25104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11460 1726776705.25132: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11460 1726776705.25137: variable 'ansible_host' from 
source: host vars for 'managed_node2' 11460 1726776705.25142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11460 1726776705.25209: Set connection var ansible_connection to ssh 11460 1726776705.25218: Set connection var ansible_pipelining to False 11460 1726776705.25224: Set connection var ansible_timeout to 10 11460 1726776705.25234: Set connection var ansible_module_compression to ZIP_DEFLATED 11460 1726776705.25239: Set connection var ansible_shell_type to sh 11460 1726776705.25244: Set connection var ansible_shell_executable to /bin/sh 11460 1726776705.25262: variable 'ansible_shell_executable' from source: unknown 11460 1726776705.25266: variable 'ansible_connection' from source: unknown 11460 1726776705.25269: variable 'ansible_module_compression' from source: unknown 11460 1726776705.25272: variable 'ansible_shell_type' from source: unknown 11460 1726776705.25275: variable 'ansible_shell_executable' from source: unknown 11460 1726776705.25278: variable 'ansible_host' from source: host vars for 'managed_node2' 11460 1726776705.25283: variable 'ansible_pipelining' from source: unknown 11460 1726776705.25286: variable 'ansible_timeout' from source: unknown 11460 1726776705.25290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11460 1726776705.25383: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11460 1726776705.25395: variable 'omit' from source: magic vars 11460 1726776705.25402: starting attempt loop 11460 1726776705.25405: running the handler 11460 1726776705.25416: _low_level_execute_command(): starting 11460 1726776705.25425: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11460 1726776705.27742: stdout chunk (state=2): 
>>>/root <<< 11460 1726776705.27865: stderr chunk (state=3): >>><<< 11460 1726776705.27876: stdout chunk (state=3): >>><<< 11460 1726776705.27896: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11460 1726776705.27911: _low_level_execute_command(): starting 11460 1726776705.27917: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438 `" && echo ansible-tmp-1726776705.2790563-11460-4615027317438="` echo /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438 `" ) && sleep 0' 11460 1726776705.30486: stdout chunk (state=2): >>>ansible-tmp-1726776705.2790563-11460-4615027317438=/root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438 <<< 11460 1726776705.30614: stderr chunk (state=3): >>><<< 11460 1726776705.30623: stdout chunk (state=3): >>><<< 11460 1726776705.30641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776705.2790563-11460-4615027317438=/root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438 , stderr= 11460 1726776705.30715: variable 'ansible_module_compression' from source: unknown 11460 1726776705.30763: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11460 1726776705.30801: variable 'ansible_facts' from source: unknown 11460 1726776705.30872: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_stat.py 11460 1726776705.30960: Sending initial data 11460 1726776705.30970: Sent initial data (150 bytes) 11460 1726776705.33462: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp3lrbtc1l /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_stat.py <<< 11460 1726776705.34548: stderr chunk (state=3): >>><<< 11460 1726776705.34558: stdout chunk (state=3): 
>>><<< 11460 1726776705.34578: done transferring module to remote 11460 1726776705.34589: _low_level_execute_command(): starting 11460 1726776705.34595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/ /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_stat.py && sleep 0' 11460 1726776705.36982: stderr chunk (state=2): >>><<< 11460 1726776705.36993: stdout chunk (state=2): >>><<< 11460 1726776705.37006: _low_level_execute_command() done: rc=0, stdout=, stderr= 11460 1726776705.37010: _low_level_execute_command(): starting 11460 1726776705.37014: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_stat.py && sleep 0' 11460 1726776705.53211: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776694.6935651, "mtime": 1726776696.493572, "ctime": 1726776696.493572, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11460 1726776705.54345: stderr chunk (state=3): >>>Shared 
connection to 10.31.12.75 closed. <<< 11460 1726776705.54396: stderr chunk (state=3): >>><<< 11460 1726776705.54403: stdout chunk (state=3): >>><<< 11460 1726776705.54423: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776694.6935651, "mtime": 1726776696.493572, "ctime": 1726776696.493572, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11460 1726776705.54471: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11460 1726776705.54510: variable 'ansible_module_compression' from source: unknown 11460 1726776705.54547: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11460 1726776705.54570: variable 'ansible_facts' from source: unknown 11460 1726776705.54634: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_file.py 11460 1726776705.54724: Sending initial data 11460 1726776705.54733: Sent initial data (150 bytes) 11460 1726776705.57285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpc8syohmc /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_file.py <<< 11460 1726776705.58398: stderr chunk (state=3): >>><<< 11460 1726776705.58408: stdout chunk (state=3): >>><<< 11460 1726776705.58430: done transferring module to remote 11460 1726776705.58440: _low_level_execute_command(): starting 11460 1726776705.58445: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/ 
/root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_file.py && sleep 0' 11460 1726776705.60809: stderr chunk (state=2): >>><<< 11460 1726776705.60817: stdout chunk (state=2): >>><<< 11460 1726776705.60832: _low_level_execute_command() done: rc=0, stdout=, stderr= 11460 1726776705.60837: _low_level_execute_command(): starting 11460 1726776705.60842: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/AnsiballZ_file.py && sleep 0' 11460 1726776705.76882: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp1hhwc3w8", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11460 1726776705.77976: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11460 1726776705.78023: stderr chunk (state=3): >>><<< 11460 1726776705.78031: stdout chunk (state=3): >>><<< 11460 1726776705.78048: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp1hhwc3w8", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11460 1726776705.78078: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmp1hhwc3w8', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11460 1726776705.78090: _low_level_execute_command(): starting 11460 1726776705.78095: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776705.2790563-11460-4615027317438/ > /dev/null 2>&1 && sleep 0' 11460 1726776705.80484: stderr chunk (state=2): >>><<< 11460 1726776705.80494: stdout chunk (state=2): >>><<< 11460 1726776705.80510: _low_level_execute_command() done: rc=0, stdout=, stderr= 11460 1726776705.80519: handler run complete 11460 1726776705.80542: attempt loop complete, returning result 11460 1726776705.80547: _execute() done 11460 1726776705.80550: dumping result to json 11460 1726776705.80555: done dumping result, returning 11460 1726776705.80563: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-cec2-986e-000000000ee1] 11460 1726776705.80569: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee1 11460 1726776705.80603: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee1 11460 1726776705.80607: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": 
"3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8218 1726776705.80779: no more pending results, returning what we have 8218 1726776705.80783: results queue empty 8218 1726776705.80783: checking for any_errors_fatal 8218 1726776705.80790: done checking for any_errors_fatal 8218 1726776705.80790: checking for max_fail_percentage 8218 1726776705.80792: done checking for max_fail_percentage 8218 1726776705.80793: checking to see if all hosts have failed and the running result is not ok 8218 1726776705.80793: done checking to see if all hosts have failed 8218 1726776705.80794: getting the remaining hosts for this loop 8218 1726776705.80795: done getting the remaining hosts for this loop 8218 1726776705.80798: getting the next task for host managed_node2 8218 1726776705.80804: done getting next task for host managed_node2 8218 1726776705.80807: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8218 1726776705.80810: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 8218 1726776705.80821: getting variables 8218 1726776705.80822: in VariableManager get_vars() 8218 1726776705.80856: Calling all_inventory to load vars for managed_node2 8218 1726776705.80859: Calling groups_inventory to load vars for managed_node2 8218 1726776705.80861: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776705.80871: Calling all_plugins_play to load vars for managed_node2 8218 1726776705.80874: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776705.80876: Calling groups_plugins_play to load vars for managed_node2 8218 1726776705.80990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776705.81119: done with get_vars() 8218 1726776705.81131: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 16:11:45 -0400 (0:00:00.573) 0:01:31.642 **** 8218 1726776705.81199: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776705.81375: worker is 1 (out of 1 available) 8218 1726776705.81390: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8218 1726776705.81402: done queuing things up, now waiting for results queue to drain 8218 1726776705.81404: waiting for pending results... 
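Between tasks, `VariableManager.get_vars()` calls the variable sources in a fixed order (all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play, as listed above), with later sources overriding earlier ones. A minimal sketch of precedence-by-merge-order, with invented variable values for illustration:

```python
def merge_vars(sources):
    """Merge (name, vars) pairs in order; keys from later sources win.

    Simplified stand-in for the layered merge VariableManager performs;
    the source names mirror the calls in the log, the values are made up.
    """
    merged = {}
    for _name, variables in sources:
        merged.update(variables)  # later sources override earlier ones
    return merged

task_vars = merge_vars([
    ("all_inventory",       {"ansible_timeout": 10, "profile": "default"}),
    ("groups_plugins_play", {"ansible_timeout": 30}),  # overrides the earlier value
])
```

This ordering is why a play-level group variable can override the same key defined directly in the inventory.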
11472 1726776705.81537: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11472 1726776705.81672: in run() - task 120fa90a-8a95-cec2-986e-000000000ee2 11472 1726776705.81688: variable 'ansible_search_path' from source: unknown 11472 1726776705.81692: variable 'ansible_search_path' from source: unknown 11472 1726776705.81720: calling self._execute() 11472 1726776705.81792: variable 'ansible_host' from source: host vars for 'managed_node2' 11472 1726776705.81801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11472 1726776705.81809: variable 'omit' from source: magic vars 11472 1726776705.81890: variable 'omit' from source: magic vars 11472 1726776705.81931: variable 'omit' from source: magic vars 11472 1726776705.81953: variable '__kernel_settings_profile_filename' from source: role '' all vars 11472 1726776705.82175: variable '__kernel_settings_profile_filename' from source: role '' all vars 11472 1726776705.82239: variable '__kernel_settings_profile_dir' from source: role '' all vars 11472 1726776705.82304: variable '__kernel_settings_profile_parent' from source: set_fact 11472 1726776705.82313: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11472 1726776705.82405: variable 'omit' from source: magic vars 11472 1726776705.82439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11472 1726776705.82465: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11472 1726776705.82485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11472 1726776705.82500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11472 1726776705.82511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11472 1726776705.82535: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11472 1726776705.82540: variable 'ansible_host' from source: host vars for 'managed_node2' 11472 1726776705.82545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11472 1726776705.82611: Set connection var ansible_connection to ssh 11472 1726776705.82619: Set connection var ansible_pipelining to False 11472 1726776705.82627: Set connection var ansible_timeout to 10 11472 1726776705.82636: Set connection var ansible_module_compression to ZIP_DEFLATED 11472 1726776705.82641: Set connection var ansible_shell_type to sh 11472 1726776705.82647: Set connection var ansible_shell_executable to /bin/sh 11472 1726776705.82662: variable 'ansible_shell_executable' from source: unknown 11472 1726776705.82666: variable 'ansible_connection' from source: unknown 11472 1726776705.82669: variable 'ansible_module_compression' from source: unknown 11472 1726776705.82672: variable 'ansible_shell_type' from source: unknown 11472 1726776705.82675: variable 'ansible_shell_executable' from source: unknown 11472 1726776705.82678: variable 'ansible_host' from source: host vars for 'managed_node2' 11472 1726776705.82684: variable 'ansible_pipelining' from source: unknown 11472 1726776705.82687: variable 'ansible_timeout' from source: unknown 11472 1726776705.82690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11472 1726776705.82814: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11472 1726776705.82826: variable 'omit' from source: magic vars 11472 1726776705.82833: starting attempt loop 11472 1726776705.82837: running the handler 11472 1726776705.82848: _low_level_execute_command(): 
starting 11472 1726776705.82855: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11472 1726776705.85182: stdout chunk (state=2): >>>/root <<< 11472 1726776705.85305: stderr chunk (state=3): >>><<< 11472 1726776705.85313: stdout chunk (state=3): >>><<< 11472 1726776705.85334: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11472 1726776705.85348: _low_level_execute_command(): starting 11472 1726776705.85355: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273 `" && echo ansible-tmp-1726776705.8534272-11472-59820471091273="` echo /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273 `" ) && sleep 0' 11472 1726776705.87943: stdout chunk (state=2): >>>ansible-tmp-1726776705.8534272-11472-59820471091273=/root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273 <<< 11472 1726776705.88079: stderr chunk (state=3): >>><<< 11472 1726776705.88087: stdout chunk (state=3): >>><<< 11472 1726776705.88104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776705.8534272-11472-59820471091273=/root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273 , stderr= 11472 1726776705.88146: variable 'ansible_module_compression' from source: unknown 11472 1726776705.88182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11472 1726776705.88218: variable 'ansible_facts' from source: unknown 11472 1726776705.88278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/AnsiballZ_kernel_settings_get_config.py 11472 1726776705.88381: Sending initial data 11472 1726776705.88388: Sent initial data (173 bytes) 11472 1726776705.90904: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpdq2ztxhq /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/AnsiballZ_kernel_settings_get_config.py <<< 11472 1726776705.91930: stderr chunk (state=3): >>><<< 11472 1726776705.91936: stdout chunk (state=3): >>><<< 11472 1726776705.91956: done transferring module to remote 11472 1726776705.91969: _low_level_execute_command(): starting 11472 1726776705.91975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/ /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11472 1726776705.94330: stderr chunk (state=2): >>><<< 11472 1726776705.94338: stdout chunk (state=2): >>><<< 11472 1726776705.94352: _low_level_execute_command() done: rc=0, stdout=, stderr= 11472 1726776705.94356: _low_level_execute_command(): starting 11472 1726776705.94362: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11472 1726776706.10227: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11472 1726776706.11284: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11472 1726776706.11334: stderr chunk (state=3): >>><<< 11472 1726776706.11340: stdout chunk (state=3): >>><<< 11472 1726776706.11358: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.12.75 closed. 
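The `kernel_settings_get_config` module returned `{"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}` from `/etc/tuned/kernel_settings/tuned.conf`. A hedged reconstruction of how an INI file like that maps to the module's `data` structure via `configparser` (the real module lives in the collection; the on-disk spacing and ordering below are assumptions):

```python
import configparser

# Reconstructed from the module result in the log; exact formatting is assumed.
TUNED_CONF = """\
[main]
summary = kernel settings

[vm]
transparent_hugepages = never
"""

def parse_tuned_conf(text: str) -> dict:
    """Parse tuned.conf-style INI text into nested plain dicts."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    return {section: dict(parser.items(section)) for section in parser.sections()}
```

Parsing `TUNED_CONF` yields the same nested structure the task result prints under `"data"`.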
11472 1726776706.11383: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11472 1726776706.11395: _low_level_execute_command(): starting 11472 1726776706.11400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776705.8534272-11472-59820471091273/ > /dev/null 2>&1 && sleep 0' 11472 1726776706.13797: stderr chunk (state=2): >>><<< 11472 1726776706.13808: stdout chunk (state=2): >>><<< 11472 1726776706.13822: _low_level_execute_command() done: rc=0, stdout=, stderr= 11472 1726776706.13831: handler run complete 11472 1726776706.13846: attempt loop complete, returning result 11472 1726776706.13849: _execute() done 11472 1726776706.13852: dumping result to json 11472 1726776706.13857: done dumping result, returning 11472 1726776706.13864: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-cec2-986e-000000000ee2] 11472 1726776706.13874: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee2 11472 1726776706.13901: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee2 11472 1726776706.13905: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, 
"vm": { "transparent_hugepages": "never" } } } 8218 1726776706.14046: no more pending results, returning what we have 8218 1726776706.14049: results queue empty 8218 1726776706.14050: checking for any_errors_fatal 8218 1726776706.14057: done checking for any_errors_fatal 8218 1726776706.14058: checking for max_fail_percentage 8218 1726776706.14060: done checking for max_fail_percentage 8218 1726776706.14060: checking to see if all hosts have failed and the running result is not ok 8218 1726776706.14061: done checking to see if all hosts have failed 8218 1726776706.14062: getting the remaining hosts for this loop 8218 1726776706.14063: done getting the remaining hosts for this loop 8218 1726776706.14066: getting the next task for host managed_node2 8218 1726776706.14072: done getting next task for host managed_node2 8218 1726776706.14075: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8218 1726776706.14078: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776706.14089: getting variables 8218 1726776706.14090: in VariableManager get_vars() 8218 1726776706.14123: Calling all_inventory to load vars for managed_node2 8218 1726776706.14126: Calling groups_inventory to load vars for managed_node2 8218 1726776706.14128: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776706.14137: Calling all_plugins_play to load vars for managed_node2 8218 1726776706.14140: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776706.14142: Calling groups_plugins_play to load vars for managed_node2 8218 1726776706.14306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776706.14424: done with get_vars() 8218 1726776706.14434: done getting variables 8218 1726776706.14477: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 16:11:46 -0400 (0:00:00.332) 0:01:31.975 **** 8218 1726776706.14501: entering _queue_task() for managed_node2/template 8218 1726776706.14664: worker is 1 (out of 1 available) 8218 1726776706.14678: exiting _queue_task() for managed_node2/template 8218 1726776706.14691: done queuing things up, now waiting for results queue to drain 8218 1726776706.14693: waiting for pending results... 
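The "Apply kernel settings" task runs the `template` action to render tuned.conf from role variables; the lines that follow show it gathering `kernel_settings_sysctl`, `kernel_settings_sysfs`, hugepages settings, and the purge/absent markers. A simplified sketch of serializing such a settings mapping back into tuned.conf sections, assuming a plain nested-dict input (the real role uses a Jinja2 template, `kernel_settings.j2`, which also handles the absent/empty state markers seen in the log):

```python
def render_tuned_conf(sections: dict) -> str:
    """Serialize nested dicts into tuned.conf-style INI text.

    Hypothetical stand-in for the role's kernel_settings.j2 template;
    purge/absent handling from the real template is omitted.
    """
    lines = []
    for name, options in sections.items():
        lines.append(f"[{name}]")
        for key, value in options.items():
            lines.append(f"{key} = {value}")
        lines.append("")  # blank line between sections
    return "\n".join(lines)
```

Fed the `data` dict from the earlier "Get current config" result, this produces a file equivalent to the one the stat result below fingerprints.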
11483 1726776706.14824: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11483 1726776706.14953: in run() - task 120fa90a-8a95-cec2-986e-000000000ee3 11483 1726776706.14972: variable 'ansible_search_path' from source: unknown 11483 1726776706.14976: variable 'ansible_search_path' from source: unknown 11483 1726776706.15003: calling self._execute() 11483 1726776706.15076: variable 'ansible_host' from source: host vars for 'managed_node2' 11483 1726776706.15084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11483 1726776706.15094: variable 'omit' from source: magic vars 11483 1726776706.15174: variable 'omit' from source: magic vars 11483 1726776706.15218: variable 'omit' from source: magic vars 11483 1726776706.15464: variable '__kernel_settings_profile_src' from source: role '' all vars 11483 1726776706.15476: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11483 1726776706.15533: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11483 1726776706.15552: variable '__kernel_settings_profile_filename' from source: role '' all vars 11483 1726776706.15601: variable '__kernel_settings_profile_filename' from source: role '' all vars 11483 1726776706.15651: variable '__kernel_settings_profile_dir' from source: role '' all vars 11483 1726776706.15716: variable '__kernel_settings_profile_parent' from source: set_fact 11483 1726776706.15724: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11483 1726776706.15750: variable 'omit' from source: magic vars 11483 1726776706.15785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11483 1726776706.15811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11483 1726776706.15834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11483 
1726776706.15848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11483 1726776706.15859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11483 1726776706.15883: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11483 1726776706.15889: variable 'ansible_host' from source: host vars for 'managed_node2' 11483 1726776706.15893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11483 1726776706.15962: Set connection var ansible_connection to ssh 11483 1726776706.15972: Set connection var ansible_pipelining to False 11483 1726776706.15979: Set connection var ansible_timeout to 10 11483 1726776706.15987: Set connection var ansible_module_compression to ZIP_DEFLATED 11483 1726776706.15992: Set connection var ansible_shell_type to sh 11483 1726776706.15997: Set connection var ansible_shell_executable to /bin/sh 11483 1726776706.16013: variable 'ansible_shell_executable' from source: unknown 11483 1726776706.16017: variable 'ansible_connection' from source: unknown 11483 1726776706.16021: variable 'ansible_module_compression' from source: unknown 11483 1726776706.16024: variable 'ansible_shell_type' from source: unknown 11483 1726776706.16027: variable 'ansible_shell_executable' from source: unknown 11483 1726776706.16032: variable 'ansible_host' from source: host vars for 'managed_node2' 11483 1726776706.16037: variable 'ansible_pipelining' from source: unknown 11483 1726776706.16040: variable 'ansible_timeout' from source: unknown 11483 1726776706.16045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11483 1726776706.16136: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11483 1726776706.16147: variable 'omit' from source: magic vars 11483 1726776706.16153: starting attempt loop 11483 1726776706.16156: running the handler 11483 1726776706.16170: _low_level_execute_command(): starting 11483 1726776706.16177: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11483 1726776706.18511: stdout chunk (state=2): >>>/root <<< 11483 1726776706.18632: stderr chunk (state=3): >>><<< 11483 1726776706.18638: stdout chunk (state=3): >>><<< 11483 1726776706.18655: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11483 1726776706.18669: _low_level_execute_command(): starting 11483 1726776706.18675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105 `" && echo ansible-tmp-1726776706.1866245-11483-30167084254105="` echo /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105 `" ) && sleep 0' 11483 1726776706.21231: stdout chunk (state=2): >>>ansible-tmp-1726776706.1866245-11483-30167084254105=/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105 <<< 11483 1726776706.21360: stderr chunk (state=3): >>><<< 11483 1726776706.21369: stdout chunk (state=3): >>><<< 11483 1726776706.21385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776706.1866245-11483-30167084254105=/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105 , stderr= 11483 1726776706.21400: evaluation_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11483 1726776706.21417: search_path: 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11483 1726776706.21440: variable 'ansible_search_path' from source: unknown 11483 1726776706.21987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11483 1726776706.23447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11483 1726776706.23492: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11483 1726776706.23521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11483 1726776706.23549: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11483 1726776706.23582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11483 1726776706.23760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11483 1726776706.23782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11483 1726776706.23803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11483 1726776706.23832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11483 1726776706.23844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11483 1726776706.24061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11483 1726776706.24080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11483 1726776706.24100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11483 1726776706.24125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11483 1726776706.24141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11483 1726776706.24372: variable 'ansible_managed' from source: unknown 11483 1726776706.24380: variable '__sections' from source: task vars 11483 1726776706.24465: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11483 1726776706.24483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11483 1726776706.24500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11483 1726776706.24525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11483 1726776706.24538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11483 1726776706.24605: variable 'kernel_settings_sysctl' from source: include params 11483 1726776706.24612: variable '__kernel_settings_state_empty' from source: role '' all vars 11483 1726776706.24619: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11483 1726776706.24650: variable '__sysctl_old' from source: task vars 11483 1726776706.24694: variable '__sysctl_old' from source: task vars 11483 1726776706.24827: variable 'kernel_settings_purge' from source: include params 11483 1726776706.24836: variable 'kernel_settings_sysctl' from source: include params 11483 1726776706.24841: variable '__kernel_settings_state_empty' from source: role '' all vars 11483 1726776706.24846: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11483 1726776706.24851: variable '__kernel_settings_profile_contents' from source: set_fact 11483 1726776706.24975: 
variable 'kernel_settings_sysfs' from source: include params 11483 1726776706.24982: variable '__kernel_settings_state_empty' from source: role '' all vars 11483 1726776706.24988: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11483 1726776706.25001: variable '__sysfs_old' from source: task vars 11483 1726776706.25044: variable '__sysfs_old' from source: task vars 11483 1726776706.25177: variable 'kernel_settings_purge' from source: include params 11483 1726776706.25184: variable 'kernel_settings_sysfs' from source: include params 11483 1726776706.25189: variable '__kernel_settings_state_empty' from source: role '' all vars 11483 1726776706.25195: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11483 1726776706.25199: variable '__kernel_settings_profile_contents' from source: set_fact 11483 1726776706.25214: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 11483 1726776706.25222: variable '__systemd_old' from source: task vars 11483 1726776706.25263: variable '__systemd_old' from source: task vars 11483 1726776706.25387: variable 'kernel_settings_purge' from source: include params 11483 1726776706.25394: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 11483 1726776706.25399: variable '__kernel_settings_state_absent' from source: role '' all vars 11483 1726776706.25405: variable '__kernel_settings_profile_contents' from source: set_fact 11483 1726776706.25419: variable 'kernel_settings_transparent_hugepages' from source: include params 11483 1726776706.25425: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 11483 1726776706.25432: variable '__trans_huge_old' from source: task vars 11483 1726776706.25472: variable '__trans_huge_old' from source: task vars 11483 1726776706.25596: variable 'kernel_settings_purge' from source: include params 11483 1726776706.25603: variable 'kernel_settings_transparent_hugepages' from 
source: include params 11483 1726776706.25608: variable '__kernel_settings_state_absent' from source: role '' all vars 11483 1726776706.25614: variable '__kernel_settings_profile_contents' from source: set_fact 11483 1726776706.25623: variable '__trans_defrag_old' from source: task vars 11483 1726776706.25665: variable '__trans_defrag_old' from source: task vars 11483 1726776706.25790: variable 'kernel_settings_purge' from source: include params 11483 1726776706.25796: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 11483 1726776706.25801: variable '__kernel_settings_state_absent' from source: role '' all vars 11483 1726776706.25807: variable '__kernel_settings_profile_contents' from source: set_fact 11483 1726776706.25824: variable '__kernel_settings_state_absent' from source: role '' all vars 11483 1726776706.25836: variable '__kernel_settings_state_absent' from source: role '' all vars 11483 1726776706.25842: variable '__kernel_settings_state_absent' from source: role '' all vars 11483 1726776706.26444: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11483 1726776706.26484: variable 'ansible_module_compression' from source: unknown 11483 1726776706.26522: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11483 1726776706.26555: variable 'ansible_facts' from source: unknown 11483 1726776706.26622: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_stat.py 11483 1726776706.26710: Sending initial data 11483 1726776706.26717: Sent initial data (151 bytes) 11483 1726776706.29296: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/tmplqcsi_gl /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_stat.py <<< 11483 1726776706.30370: stderr chunk (state=3): >>><<< 11483 1726776706.30380: stdout chunk (state=3): >>><<< 11483 1726776706.30399: done transferring module to remote 11483 1726776706.30410: _low_level_execute_command(): starting 11483 1726776706.30416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/ /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_stat.py && sleep 0' 11483 1726776706.32822: stderr chunk (state=2): >>><<< 11483 1726776706.32831: stdout chunk (state=2): >>><<< 11483 1726776706.32845: _low_level_execute_command() done: rc=0, stdout=, stderr= 11483 1726776706.32850: _low_level_execute_command(): starting 11483 1726776706.32855: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_stat.py && sleep 0' 11483 1726776706.49186: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 121, "inode": 224395467, "dev": 51713, "nlink": 1, "atime": 1726776696.477572, "mtime": 1726776695.712569, "ctime": 1726776695.95857, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "mimetype": "text/plain", "charset": "us-ascii", "version": "3435563319", "attributes": [], 
"attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11483 1726776706.50395: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11483 1726776706.50444: stderr chunk (state=3): >>><<< 11483 1726776706.50453: stdout chunk (state=3): >>><<< 11483 1726776706.50469: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 121, "inode": 224395467, "dev": 51713, "nlink": 1, "atime": 1726776696.477572, "mtime": 1726776695.712569, "ctime": 1726776695.95857, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "mimetype": "text/plain", "charset": "us-ascii", "version": "3435563319", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11483 1726776706.50531: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11483 1726776706.50616: Sending initial data 11483 1726776706.50624: Sent initial data (159 bytes) 11483 1726776706.53174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpdue0hndl/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source <<< 11483 1726776706.53514: stderr chunk (state=3): >>><<< 11483 1726776706.53521: stdout chunk (state=3): >>><<< 11483 1726776706.53539: _low_level_execute_command(): starting 11483 1726776706.53545: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/ /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source && sleep 0' 11483 1726776706.55878: stderr chunk (state=2): >>><<< 11483 1726776706.55886: stdout chunk (state=2): >>><<< 11483 1726776706.55899: _low_level_execute_command() done: rc=0, stdout=, stderr= 11483 1726776706.55919: variable 'ansible_module_compression' from source: unknown 11483 1726776706.55954: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11483 1726776706.55973: 
variable 'ansible_facts' from source: unknown 11483 1726776706.56034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_copy.py 11483 1726776706.56116: Sending initial data 11483 1726776706.56123: Sent initial data (151 bytes) 11483 1726776706.58594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp_60_134_ /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_copy.py <<< 11483 1726776706.59664: stderr chunk (state=3): >>><<< 11483 1726776706.59673: stdout chunk (state=3): >>><<< 11483 1726776706.59691: done transferring module to remote 11483 1726776706.59700: _low_level_execute_command(): starting 11483 1726776706.59706: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/ /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_copy.py && sleep 0' 11483 1726776706.62007: stderr chunk (state=2): >>><<< 11483 1726776706.62014: stdout chunk (state=2): >>><<< 11483 1726776706.62027: _low_level_execute_command() done: rc=0, stdout=, stderr= 11483 1726776706.62032: _low_level_execute_command(): starting 11483 1726776706.62038: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/AnsiballZ_copy.py && sleep 0' 11483 1726776706.78624: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": 
"/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11483 1726776706.79830: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11483 1726776706.79878: stderr chunk (state=3): >>><<< 11483 1726776706.79886: stdout chunk (state=3): >>><<< 11483 1726776706.79902: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11483 1726776706.79931: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11483 1726776706.79962: _low_level_execute_command(): starting 11483 1726776706.79972: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/ > /dev/null 2>&1 && sleep 0' 11483 1726776706.82395: stderr chunk (state=2): >>><<< 11483 1726776706.82402: stdout chunk (state=2): >>><<< 11483 1726776706.82417: _low_level_execute_command() done: rc=0, stdout=, stderr= 11483 1726776706.82427: handler run complete 11483 1726776706.82449: attempt loop complete, returning result 11483 1726776706.82452: _execute() done 11483 1726776706.82456: dumping result to json 11483 1726776706.82461: done dumping result, returning 11483 1726776706.82470: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-cec2-986e-000000000ee3] 11483 1726776706.82477: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee3 11483 1726776706.82521: done sending task result for task 
120fa90a-8a95-cec2-986e-000000000ee3 11483 1726776706.82525: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726776706.1866245-11483-30167084254105/source", "state": "file", "uid": 0 } 8218 1726776706.82710: no more pending results, returning what we have 8218 1726776706.82713: results queue empty 8218 1726776706.82714: checking for any_errors_fatal 8218 1726776706.82720: done checking for any_errors_fatal 8218 1726776706.82721: checking for max_fail_percentage 8218 1726776706.82722: done checking for max_fail_percentage 8218 1726776706.82723: checking to see if all hosts have failed and the running result is not ok 8218 1726776706.82723: done checking to see if all hosts have failed 8218 1726776706.82724: getting the remaining hosts for this loop 8218 1726776706.82725: done getting the remaining hosts for this loop 8218 1726776706.82730: getting the next task for host managed_node2 8218 1726776706.82736: done getting next task for host managed_node2 8218 1726776706.82739: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8218 1726776706.82742: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776706.82752: getting variables 8218 1726776706.82753: in VariableManager get_vars() 8218 1726776706.82786: Calling all_inventory to load vars for managed_node2 8218 1726776706.82789: Calling groups_inventory to load vars for managed_node2 8218 1726776706.82790: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776706.82798: Calling all_plugins_play to load vars for managed_node2 8218 1726776706.82800: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776706.82802: Calling groups_plugins_play to load vars for managed_node2 8218 1726776706.82912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776706.83037: done with get_vars() 8218 1726776706.83046: done getting variables 8218 1726776706.83088: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:11:46 -0400 (0:00:00.686) 0:01:32.661 **** 8218 1726776706.83116: entering _queue_task() for managed_node2/service 8218 1726776706.83272: worker is 1 (out of 1 available) 8218 
1726776706.83286: exiting _queue_task() for managed_node2/service 8218 1726776706.83298: done queuing things up, now waiting for results queue to drain 8218 1726776706.83299: waiting for pending results... 11498 1726776706.83434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11498 1726776706.83563: in run() - task 120fa90a-8a95-cec2-986e-000000000ee4 11498 1726776706.83581: variable 'ansible_search_path' from source: unknown 11498 1726776706.83585: variable 'ansible_search_path' from source: unknown 11498 1726776706.83619: variable '__kernel_settings_services' from source: include_vars 11498 1726776706.83864: variable '__kernel_settings_services' from source: include_vars 11498 1726776706.84000: variable 'omit' from source: magic vars 11498 1726776706.84075: variable 'ansible_host' from source: host vars for 'managed_node2' 11498 1726776706.84086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11498 1726776706.84095: variable 'omit' from source: magic vars 11498 1726776706.84270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11498 1726776706.84437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11498 1726776706.84472: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11498 1726776706.84497: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11498 1726776706.84524: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11498 1726776706.84603: variable '__kernel_settings_register_profile' from source: set_fact 11498 1726776706.84612: variable '__kernel_settings_register_mode' from source: set_fact 11498 1726776706.84627: Evaluated conditional (__kernel_settings_register_profile is 
changed or __kernel_settings_register_mode is changed): False 11498 1726776706.84633: when evaluation is False, skipping this task 11498 1726776706.84654: variable 'item' from source: unknown 11498 1726776706.84702: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 11498 1726776706.84731: dumping result to json 11498 1726776706.84737: done dumping result, returning 11498 1726776706.84743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-cec2-986e-000000000ee4] 11498 1726776706.84749: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee4 11498 1726776706.84775: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee4 11498 1726776706.84779: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8218 1726776706.84925: no more pending results, returning what we have 8218 1726776706.84928: results queue empty 8218 1726776706.84931: checking for any_errors_fatal 8218 1726776706.84943: done checking for any_errors_fatal 8218 1726776706.84944: checking for max_fail_percentage 8218 1726776706.84945: done checking for max_fail_percentage 8218 1726776706.84946: checking to see if all hosts have failed and the running result is not ok 8218 1726776706.84946: done checking to see if all hosts have failed 8218 1726776706.84947: getting the remaining hosts for this loop 8218 1726776706.84948: done getting the remaining hosts for this loop 8218 1726776706.84950: getting the next task for host managed_node2 8218 1726776706.84956: done getting next task for host managed_node2 8218 1726776706.84959: ^ task is: TASK: 
fedora.linux_system_roles.kernel_settings : Tuned apply settings 8218 1726776706.84962: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776706.84974: getting variables 8218 1726776706.84975: in VariableManager get_vars() 8218 1726776706.84997: Calling all_inventory to load vars for managed_node2 8218 1726776706.84999: Calling groups_inventory to load vars for managed_node2 8218 1726776706.85000: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776706.85006: Calling all_plugins_play to load vars for managed_node2 8218 1726776706.85007: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776706.85009: Calling groups_plugins_play to load vars for managed_node2 8218 1726776706.85108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776706.85230: done with get_vars() 8218 1726776706.85239: done getting variables 8218 1726776706.85280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:11:46 -0400 (0:00:00.021) 0:01:32.683 **** 8218 1726776706.85302: entering _queue_task() for managed_node2/command 8218 1726776706.85445: worker is 1 (out of 1 available) 8218 1726776706.85458: exiting _queue_task() for managed_node2/command 8218 1726776706.85472: done queuing things up, now waiting for results queue to drain 8218 1726776706.85473: waiting for pending results... 11499 1726776706.85589: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11499 1726776706.85692: in run() - task 120fa90a-8a95-cec2-986e-000000000ee5 11499 1726776706.85709: variable 'ansible_search_path' from source: unknown 11499 1726776706.85713: variable 'ansible_search_path' from source: unknown 11499 1726776706.85740: calling self._execute() 11499 1726776706.85805: variable 'ansible_host' from source: host vars for 'managed_node2' 11499 1726776706.85814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11499 1726776706.85824: variable 'omit' from source: magic vars 11499 1726776706.86137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11499 1726776706.86399: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11499 1726776706.86438: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11499 1726776706.86463: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11499 1726776706.86488: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11499 1726776706.86564: variable '__kernel_settings_register_profile' from source: set_fact 11499 1726776706.86585: Evaluated conditional (not __kernel_settings_register_profile is changed): True 11499 1726776706.86674: variable '__kernel_settings_register_mode' from source: set_fact 11499 1726776706.86686: Evaluated conditional (not __kernel_settings_register_mode is changed): True 11499 1726776706.86759: variable '__kernel_settings_register_apply' from source: set_fact 11499 1726776706.86770: Evaluated conditional (__kernel_settings_register_apply is changed): True 11499 1726776706.86777: variable 'omit' from source: magic vars 11499 1726776706.86810: variable 'omit' from source: magic vars 11499 1726776706.86888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11499 1726776706.88269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11499 1726776706.88319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11499 1726776706.88348: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11499 1726776706.88372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11499 1726776706.88392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11499 1726776706.88443: variable '__kernel_settings_active_profile' from source: set_fact 11499 1726776706.88468: variable 'omit' from source: magic vars 11499 1726776706.88489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11499 1726776706.88509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 11499 1726776706.88524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11499 1726776706.88539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11499 1726776706.88549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11499 1726776706.88571: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11499 1726776706.88576: variable 'ansible_host' from source: host vars for 'managed_node2' 11499 1726776706.88580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11499 1726776706.88644: Set connection var ansible_connection to ssh 11499 1726776706.88652: Set connection var ansible_pipelining to False 11499 1726776706.88658: Set connection var ansible_timeout to 10 11499 1726776706.88665: Set connection var ansible_module_compression to ZIP_DEFLATED 11499 1726776706.88671: Set connection var ansible_shell_type to sh 11499 1726776706.88676: Set connection var ansible_shell_executable to /bin/sh 11499 1726776706.88691: variable 'ansible_shell_executable' from source: unknown 11499 1726776706.88694: variable 'ansible_connection' from source: unknown 11499 1726776706.88697: variable 'ansible_module_compression' from source: unknown 11499 1726776706.88701: variable 'ansible_shell_type' from source: unknown 11499 1726776706.88704: variable 'ansible_shell_executable' from source: unknown 11499 1726776706.88708: variable 'ansible_host' from source: host vars for 'managed_node2' 11499 1726776706.88712: variable 'ansible_pipelining' from source: unknown 11499 1726776706.88715: variable 'ansible_timeout' from source: unknown 11499 1726776706.88719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11499 1726776706.88784: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11499 1726776706.88795: variable 'omit' from source: magic vars 11499 1726776706.88800: starting attempt loop 11499 1726776706.88803: running the handler 11499 1726776706.88814: _low_level_execute_command(): starting 11499 1726776706.88820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11499 1726776706.91109: stdout chunk (state=2): >>>/root <<< 11499 1726776706.91230: stderr chunk (state=3): >>><<< 11499 1726776706.91236: stdout chunk (state=3): >>><<< 11499 1726776706.91252: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11499 1726776706.91262: _low_level_execute_command(): starting 11499 1726776706.91270: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123 `" && echo ansible-tmp-1726776706.9125898-11499-52846850979123="` echo /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123 `" ) && sleep 0' 11499 1726776706.93761: stdout chunk (state=2): >>>ansible-tmp-1726776706.9125898-11499-52846850979123=/root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123 <<< 11499 1726776706.93891: stderr chunk (state=3): >>><<< 11499 1726776706.93897: stdout chunk (state=3): >>><<< 11499 1726776706.93910: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776706.9125898-11499-52846850979123=/root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123 , stderr= 11499 1726776706.93934: variable 'ansible_module_compression' from source: unknown 11499 1726776706.93970: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11499 1726776706.94002: variable 'ansible_facts' from source: unknown 11499 1726776706.94066: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/AnsiballZ_command.py 11499 1726776706.94159: Sending initial data 11499 1726776706.94165: Sent initial data (154 bytes) 11499 1726776706.96639: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpiz4yyvlx /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/AnsiballZ_command.py <<< 11499 1726776706.97706: stderr chunk (state=3): >>><<< 11499 1726776706.97713: stdout chunk (state=3): >>><<< 11499 1726776706.97733: done transferring module to remote 11499 1726776706.97744: _low_level_execute_command(): starting 11499 1726776706.97751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/ /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/AnsiballZ_command.py && sleep 0' 11499 1726776707.00112: stderr chunk (state=2): >>><<< 11499 1726776707.00120: stdout chunk (state=2): >>><<< 11499 1726776707.00136: _low_level_execute_command() done: rc=0, stdout=, stderr= 11499 1726776707.00140: _low_level_execute_command(): starting 11499 1726776707.00146: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/AnsiballZ_command.py && sleep 0' 11499 1726776708.30817: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:47.152273", "end": "2024-09-19 16:11:48.306268", "delta": "0:00:01.153995", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11499 1726776708.32036: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11499 1726776708.32088: stderr chunk (state=3): >>><<< 11499 1726776708.32094: stdout chunk (state=3): >>><<< 11499 1726776708.32111: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 16:11:47.152273", "end": "2024-09-19 16:11:48.306268", "delta": "0:00:01.153995", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11499 1726776708.32140: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11499 1726776708.32151: _low_level_execute_command(): starting 11499 1726776708.32156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776706.9125898-11499-52846850979123/ > /dev/null 2>&1 && sleep 0' 11499 1726776708.34580: stderr chunk (state=2): >>><<< 11499 1726776708.34588: stdout chunk (state=2): >>><<< 11499 1726776708.34601: _low_level_execute_command() done: rc=0, stdout=, stderr= 11499 1726776708.34608: handler run complete 11499 1726776708.34625: Evaluated conditional (True): True 11499 1726776708.34636: attempt loop complete, returning result 11499 1726776708.34640: _execute() done 11499 1726776708.34644: dumping result to json 11499 1726776708.34649: done dumping result, returning 11499 1726776708.34656: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-cec2-986e-000000000ee5] 11499 1726776708.34662: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee5 11499 1726776708.34692: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee5 11499 1726776708.34696: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", 
"profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.153995", "end": "2024-09-19 16:11:48.306268", "rc": 0, "start": "2024-09-19 16:11:47.152273" } 8218 1726776708.34837: no more pending results, returning what we have 8218 1726776708.34841: results queue empty 8218 1726776708.34841: checking for any_errors_fatal 8218 1726776708.34849: done checking for any_errors_fatal 8218 1726776708.34850: checking for max_fail_percentage 8218 1726776708.34851: done checking for max_fail_percentage 8218 1726776708.34852: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.34853: done checking to see if all hosts have failed 8218 1726776708.34853: getting the remaining hosts for this loop 8218 1726776708.34854: done getting the remaining hosts for this loop 8218 1726776708.34857: getting the next task for host managed_node2 8218 1726776708.34863: done getting next task for host managed_node2 8218 1726776708.34866: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8218 1726776708.34870: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776708.34880: getting variables 8218 1726776708.34881: in VariableManager get_vars() 8218 1726776708.34914: Calling all_inventory to load vars for managed_node2 8218 1726776708.34917: Calling groups_inventory to load vars for managed_node2 8218 1726776708.34919: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776708.34927: Calling all_plugins_play to load vars for managed_node2 8218 1726776708.34931: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776708.34934: Calling groups_plugins_play to load vars for managed_node2 8218 1726776708.35057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776708.35250: done with get_vars() 8218 1726776708.35258: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:11:48 -0400 (0:00:01.500) 0:01:34.183 **** 8218 1726776708.35326: entering _queue_task() for managed_node2/include_tasks 8218 1726776708.35483: worker is 1 (out of 1 available) 8218 1726776708.35499: exiting _queue_task() for managed_node2/include_tasks 8218 1726776708.35512: done queuing things up, now waiting for results queue to drain 8218 1726776708.35513: waiting for pending results... 
11513 1726776708.35651: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11513 1726776708.35787: in run() - task 120fa90a-8a95-cec2-986e-000000000ee6 11513 1726776708.35803: variable 'ansible_search_path' from source: unknown 11513 1726776708.35808: variable 'ansible_search_path' from source: unknown 11513 1726776708.35837: calling self._execute() 11513 1726776708.35909: variable 'ansible_host' from source: host vars for 'managed_node2' 11513 1726776708.35916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11513 1726776708.35921: variable 'omit' from source: magic vars 11513 1726776708.36253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11513 1726776708.36445: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11513 1726776708.36483: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11513 1726776708.36511: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11513 1726776708.36543: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11513 1726776708.36638: variable '__kernel_settings_register_apply' from source: set_fact 11513 1726776708.36661: Evaluated conditional (__kernel_settings_register_apply is changed): True 11513 1726776708.36670: _execute() done 11513 1726776708.36674: dumping result to json 11513 1726776708.36678: done dumping result, returning 11513 1726776708.36685: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-cec2-986e-000000000ee6] 11513 1726776708.36691: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee6 11513 1726776708.36714: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee6 11513 
1726776708.36718: WORKER PROCESS EXITING 8218 1726776708.36853: no more pending results, returning what we have 8218 1726776708.36857: in VariableManager get_vars() 8218 1726776708.36894: Calling all_inventory to load vars for managed_node2 8218 1726776708.36897: Calling groups_inventory to load vars for managed_node2 8218 1726776708.36899: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776708.36907: Calling all_plugins_play to load vars for managed_node2 8218 1726776708.36909: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776708.36911: Calling groups_plugins_play to load vars for managed_node2 8218 1726776708.37019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776708.37141: done with get_vars() 8218 1726776708.37147: variable 'ansible_search_path' from source: unknown 8218 1726776708.37147: variable 'ansible_search_path' from source: unknown 8218 1726776708.37171: we have included files to process 8218 1726776708.37172: generating all_blocks data 8218 1726776708.37176: done generating all_blocks data 8218 1726776708.37180: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8218 1726776708.37181: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8218 1726776708.37182: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8218 1726776708.37443: done processing included file 8218 1726776708.37446: iterating over new_blocks loaded from include file 8218 1726776708.37447: in VariableManager get_vars() 8218 1726776708.37464: done with get_vars() 8218 
1726776708.37465: filtering new block on tags 8218 1726776708.37502: done filtering new block on tags 8218 1726776708.37504: done iterating over new_blocks loaded from include file 8218 1726776708.37504: extending task lists for all hosts with included blocks 8218 1726776708.38083: done extending task lists 8218 1726776708.38084: done processing included files 8218 1726776708.38085: results queue empty 8218 1726776708.38085: checking for any_errors_fatal 8218 1726776708.38088: done checking for any_errors_fatal 8218 1726776708.38089: checking for max_fail_percentage 8218 1726776708.38089: done checking for max_fail_percentage 8218 1726776708.38090: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.38090: done checking to see if all hosts have failed 8218 1726776708.38091: getting the remaining hosts for this loop 8218 1726776708.38091: done getting the remaining hosts for this loop 8218 1726776708.38093: getting the next task for host managed_node2 8218 1726776708.38096: done getting next task for host managed_node2 8218 1726776708.38097: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8218 1726776708.38100: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776708.38107: getting variables 8218 1726776708.38107: in VariableManager get_vars() 8218 1726776708.38116: Calling all_inventory to load vars for managed_node2 8218 1726776708.38117: Calling groups_inventory to load vars for managed_node2 8218 1726776708.38118: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776708.38122: Calling all_plugins_play to load vars for managed_node2 8218 1726776708.38123: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776708.38124: Calling groups_plugins_play to load vars for managed_node2 8218 1726776708.38202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776708.38314: done with get_vars() 8218 1726776708.38320: done getting variables 8218 1726776708.38346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 16:11:48 -0400 (0:00:00.030) 0:01:34.214 **** 8218 1726776708.38375: entering 
_queue_task() for managed_node2/command 8218 1726776708.38527: worker is 1 (out of 1 available) 8218 1726776708.38543: exiting _queue_task() for managed_node2/command 8218 1726776708.38554: done queuing things up, now waiting for results queue to drain 8218 1726776708.38555: waiting for pending results... 11514 1726776708.38689: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11514 1726776708.38816: in run() - task 120fa90a-8a95-cec2-986e-000000000fc5 11514 1726776708.38832: variable 'ansible_search_path' from source: unknown 11514 1726776708.38836: variable 'ansible_search_path' from source: unknown 11514 1726776708.38862: calling self._execute() 11514 1726776708.38925: variable 'ansible_host' from source: host vars for 'managed_node2' 11514 1726776708.38935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11514 1726776708.38944: variable 'omit' from source: magic vars 11514 1726776708.39016: variable 'omit' from source: magic vars 11514 1726776708.39064: variable 'omit' from source: magic vars 11514 1726776708.39087: variable 'omit' from source: magic vars 11514 1726776708.39121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11514 1726776708.39149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11514 1726776708.39169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11514 1726776708.39184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11514 1726776708.39195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11514 1726776708.39218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11514 1726776708.39224: 
variable 'ansible_host' from source: host vars for 'managed_node2' 11514 1726776708.39229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11514 1726776708.39294: Set connection var ansible_connection to ssh 11514 1726776708.39303: Set connection var ansible_pipelining to False 11514 1726776708.39310: Set connection var ansible_timeout to 10 11514 1726776708.39317: Set connection var ansible_module_compression to ZIP_DEFLATED 11514 1726776708.39323: Set connection var ansible_shell_type to sh 11514 1726776708.39329: Set connection var ansible_shell_executable to /bin/sh 11514 1726776708.39346: variable 'ansible_shell_executable' from source: unknown 11514 1726776708.39350: variable 'ansible_connection' from source: unknown 11514 1726776708.39354: variable 'ansible_module_compression' from source: unknown 11514 1726776708.39358: variable 'ansible_shell_type' from source: unknown 11514 1726776708.39361: variable 'ansible_shell_executable' from source: unknown 11514 1726776708.39365: variable 'ansible_host' from source: host vars for 'managed_node2' 11514 1726776708.39369: variable 'ansible_pipelining' from source: unknown 11514 1726776708.39372: variable 'ansible_timeout' from source: unknown 11514 1726776708.39376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11514 1726776708.39467: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11514 1726776708.39480: variable 'omit' from source: magic vars 11514 1726776708.39485: starting attempt loop 11514 1726776708.39489: running the handler 11514 1726776708.39501: _low_level_execute_command(): starting 11514 1726776708.39508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11514 
1726776708.41843: stdout chunk (state=2): >>>/root <<< 11514 1726776708.41960: stderr chunk (state=3): >>><<< 11514 1726776708.41966: stdout chunk (state=3): >>><<< 11514 1726776708.41983: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11514 1726776708.41998: _low_level_execute_command(): starting 11514 1726776708.42003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872 `" && echo ansible-tmp-1726776708.4199314-11514-160334021543872="` echo /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872 `" ) && sleep 0' 11514 1726776708.44805: stdout chunk (state=2): >>>ansible-tmp-1726776708.4199314-11514-160334021543872=/root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872 <<< 11514 1726776708.44943: stderr chunk (state=3): >>><<< 11514 1726776708.44952: stdout chunk (state=3): >>><<< 11514 1726776708.44970: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776708.4199314-11514-160334021543872=/root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872 , stderr= 11514 1726776708.44996: variable 'ansible_module_compression' from source: unknown 11514 1726776708.45042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11514 1726776708.45081: variable 'ansible_facts' from source: unknown 11514 1726776708.45151: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/AnsiballZ_command.py 11514 1726776708.45254: Sending initial data 11514 1726776708.45261: Sent initial data (155 bytes) 11514 1726776708.47777: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpx_qk_nng /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/AnsiballZ_command.py <<< 11514 1726776708.48836: stderr chunk 
(state=3): >>><<< 11514 1726776708.48842: stdout chunk (state=3): >>><<< 11514 1726776708.48861: done transferring module to remote 11514 1726776708.48871: _low_level_execute_command(): starting 11514 1726776708.48876: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/ /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/AnsiballZ_command.py && sleep 0' 11514 1726776708.51249: stderr chunk (state=2): >>><<< 11514 1726776708.51255: stdout chunk (state=2): >>><<< 11514 1726776708.51270: _low_level_execute_command() done: rc=0, stdout=, stderr= 11514 1726776708.51274: _low_level_execute_command(): starting 11514 1726776708.51279: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/AnsiballZ_command.py && sleep 0' 11514 1726776708.77442: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:48.666992", "end": "2024-09-19 16:11:48.772555", "delta": "0:00:00.105563", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11514 1726776708.78656: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11514 1726776708.78705: stderr chunk (state=3): >>><<< 11514 1726776708.78711: stdout chunk (state=3): >>><<< 11514 1726776708.78731: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:11:48.666992", "end": "2024-09-19 16:11:48.772555", "delta": "0:00:00.105563", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11514 1726776708.78775: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11514 1726776708.78786: _low_level_execute_command(): starting 11514 1726776708.78793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776708.4199314-11514-160334021543872/ > /dev/null 2>&1 && sleep 0' 11514 1726776708.81231: stderr chunk (state=2): >>><<< 11514 1726776708.81237: stdout chunk (state=2): >>><<< 11514 1726776708.81250: _low_level_execute_command() 
done: rc=0, stdout=, stderr= 11514 1726776708.81257: handler run complete 11514 1726776708.81278: Evaluated conditional (False): False 11514 1726776708.81288: attempt loop complete, returning result 11514 1726776708.81292: _execute() done 11514 1726776708.81295: dumping result to json 11514 1726776708.81300: done dumping result, returning 11514 1726776708.81307: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-cec2-986e-000000000fc5] 11514 1726776708.81313: sending task result for task 120fa90a-8a95-cec2-986e-000000000fc5 11514 1726776708.81344: done sending task result for task 120fa90a-8a95-cec2-986e-000000000fc5 11514 1726776708.81348: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.105563", "end": "2024-09-19 16:11:48.772555", "rc": 0, "start": "2024-09-19 16:11:48.666992" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
8218 1726776708.81504: no more pending results, returning what we have 8218 1726776708.81507: results queue empty 8218 1726776708.81508: checking for any_errors_fatal 8218 1726776708.81510: done checking for any_errors_fatal 8218 1726776708.81511: checking for max_fail_percentage 8218 1726776708.81512: done checking for max_fail_percentage 8218 1726776708.81513: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.81513: done checking to see if all hosts have failed 8218 1726776708.81514: getting the remaining hosts for this loop 8218 1726776708.81515: done getting the remaining hosts for this loop 8218 1726776708.81518: getting the next task for host managed_node2 8218 1726776708.81525: done getting next task for host managed_node2 8218 1726776708.81528: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8218 1726776708.81534: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 8218 1726776708.81544: getting variables 8218 1726776708.81545: in VariableManager get_vars() 8218 1726776708.81579: Calling all_inventory to load vars for managed_node2 8218 1726776708.81582: Calling groups_inventory to load vars for managed_node2 8218 1726776708.81584: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776708.81592: Calling all_plugins_play to load vars for managed_node2 8218 1726776708.81594: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776708.81596: Calling groups_plugins_play to load vars for managed_node2 8218 1726776708.81706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776708.81865: done with get_vars() 8218 1726776708.81874: done getting variables 8218 1726776708.81916: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 16:11:48 -0400 (0:00:00.435) 0:01:34.649 **** 8218 1726776708.81943: entering _queue_task() for managed_node2/shell 8218 1726776708.82098: worker is 1 (out of 1 available) 8218 1726776708.82112: exiting _queue_task() for managed_node2/shell 8218 1726776708.82124: done queuing things up, now waiting for results queue to drain 8218 1726776708.82125: waiting for pending results... 
11522 1726776708.82262: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11522 1726776708.82394: in run() - task 120fa90a-8a95-cec2-986e-000000000fc6 11522 1726776708.82411: variable 'ansible_search_path' from source: unknown 11522 1726776708.82415: variable 'ansible_search_path' from source: unknown 11522 1726776708.82444: calling self._execute() 11522 1726776708.82514: variable 'ansible_host' from source: host vars for 'managed_node2' 11522 1726776708.82522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11522 1726776708.82533: variable 'omit' from source: magic vars 11522 1726776708.82858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11522 1726776708.83041: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11522 1726776708.83077: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11522 1726776708.83105: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11522 1726776708.83136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11522 1726776708.83222: variable '__kernel_settings_register_verify_values' from source: set_fact 11522 1726776708.83246: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11522 1726776708.83251: when evaluation is False, skipping this task 11522 1726776708.83254: _execute() done 11522 1726776708.83258: dumping result to json 11522 1726776708.83262: done dumping result, returning 11522 1726776708.83270: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-cec2-986e-000000000fc6] 11522 1726776708.83277: sending task result for task 
120fa90a-8a95-cec2-986e-000000000fc6 11522 1726776708.83298: done sending task result for task 120fa90a-8a95-cec2-986e-000000000fc6 11522 1726776708.83301: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8218 1726776708.83401: no more pending results, returning what we have 8218 1726776708.83404: results queue empty 8218 1726776708.83405: checking for any_errors_fatal 8218 1726776708.83413: done checking for any_errors_fatal 8218 1726776708.83414: checking for max_fail_percentage 8218 1726776708.83415: done checking for max_fail_percentage 8218 1726776708.83416: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.83416: done checking to see if all hosts have failed 8218 1726776708.83417: getting the remaining hosts for this loop 8218 1726776708.83418: done getting the remaining hosts for this loop 8218 1726776708.83421: getting the next task for host managed_node2 8218 1726776708.83428: done getting next task for host managed_node2 8218 1726776708.83432: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8218 1726776708.83437: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8218 1726776708.83452: getting variables 8218 1726776708.83454: in VariableManager get_vars() 8218 1726776708.83483: Calling all_inventory to load vars for managed_node2 8218 1726776708.83485: Calling groups_inventory to load vars for managed_node2 8218 1726776708.83487: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776708.83494: Calling all_plugins_play to load vars for managed_node2 8218 1726776708.83496: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776708.83498: Calling groups_plugins_play to load vars for managed_node2 8218 1726776708.83600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776708.83719: done with get_vars() 8218 1726776708.83727: done getting variables 8218 1726776708.83768: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 16:11:48 -0400 (0:00:00.018) 0:01:34.668 **** 8218 1726776708.83791: entering 
_queue_task() for managed_node2/fail 8218 1726776708.83939: worker is 1 (out of 1 available) 8218 1726776708.83952: exiting _queue_task() for managed_node2/fail 8218 1726776708.83962: done queuing things up, now waiting for results queue to drain 8218 1726776708.83963: waiting for pending results... 11523 1726776708.84090: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11523 1726776708.84215: in run() - task 120fa90a-8a95-cec2-986e-000000000fc7 11523 1726776708.84232: variable 'ansible_search_path' from source: unknown 11523 1726776708.84236: variable 'ansible_search_path' from source: unknown 11523 1726776708.84261: calling self._execute() 11523 1726776708.84325: variable 'ansible_host' from source: host vars for 'managed_node2' 11523 1726776708.84335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11523 1726776708.84344: variable 'omit' from source: magic vars 11523 1726776708.84661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11523 1726776708.84897: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11523 1726776708.84931: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11523 1726776708.84958: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11523 1726776708.84986: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11523 1726776708.85072: variable '__kernel_settings_register_verify_values' from source: set_fact 11523 1726776708.85094: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11523 1726776708.85099: when evaluation is False, skipping this task 11523 1726776708.85102: _execute() done 11523 1726776708.85106: dumping result to json 11523 
1726776708.85110: done dumping result, returning 11523 1726776708.85116: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-cec2-986e-000000000fc7] 11523 1726776708.85123: sending task result for task 120fa90a-8a95-cec2-986e-000000000fc7 11523 1726776708.85148: done sending task result for task 120fa90a-8a95-cec2-986e-000000000fc7 11523 1726776708.85152: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8218 1726776708.85253: no more pending results, returning what we have 8218 1726776708.85256: results queue empty 8218 1726776708.85257: checking for any_errors_fatal 8218 1726776708.85262: done checking for any_errors_fatal 8218 1726776708.85263: checking for max_fail_percentage 8218 1726776708.85265: done checking for max_fail_percentage 8218 1726776708.85265: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.85266: done checking to see if all hosts have failed 8218 1726776708.85267: getting the remaining hosts for this loop 8218 1726776708.85268: done getting the remaining hosts for this loop 8218 1726776708.85270: getting the next task for host managed_node2 8218 1726776708.85278: done getting next task for host managed_node2 8218 1726776708.85281: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8218 1726776708.85284: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8218 1726776708.85300: getting variables
8218 1726776708.85301: in VariableManager get_vars()
8218 1726776708.85337: Calling all_inventory to load vars for managed_node2
8218 1726776708.85340: Calling groups_inventory to load vars for managed_node2
8218 1726776708.85342: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776708.85349: Calling all_plugins_play to load vars for managed_node2
8218 1726776708.85351: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776708.85354: Calling groups_plugins_play to load vars for managed_node2
8218 1726776708.85501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776708.85613: done with get_vars()
8218 1726776708.85620: done getting variables
8218 1726776708.85660: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177
Thursday 19 September 2024  16:11:48 -0400 (0:00:00.018)       0:01:34.687 ****
8218 1726776708.85682: entering _queue_task() for managed_node2/set_fact
8218 1726776708.85833: worker is 1 (out of 1 available)
8218 1726776708.85847: exiting _queue_task() for managed_node2/set_fact
8218 1726776708.85859: done queuing things up, now waiting for results queue to drain
8218 1726776708.85861: waiting for pending results...
11524 1726776708.85990: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes
11524 1726776708.86102: in run() - task 120fa90a-8a95-cec2-986e-000000000ee7
11524 1726776708.86117: variable 'ansible_search_path' from source: unknown
11524 1726776708.86121: variable 'ansible_search_path' from source: unknown
11524 1726776708.86149: calling self._execute()
11524 1726776708.86216: variable 'ansible_host' from source: host vars for 'managed_node2'
11524 1726776708.86226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11524 1726776708.86236: variable 'omit' from source: magic vars
11524 1726776708.86312: variable 'omit' from source: magic vars
11524 1726776708.86353: variable 'omit' from source: magic vars
11524 1726776708.86377: variable 'omit' from source: magic vars
11524 1726776708.86411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11524 1726776708.86440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11524 1726776708.86459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11524 1726776708.86474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11524 1726776708.86486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11524 1726776708.86508: variable
'inventory_hostname' from source: host vars for 'managed_node2' 11524 1726776708.86513: variable 'ansible_host' from source: host vars for 'managed_node2' 11524 1726776708.86517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11524 1726776708.86585: Set connection var ansible_connection to ssh 11524 1726776708.86592: Set connection var ansible_pipelining to False 11524 1726776708.86599: Set connection var ansible_timeout to 10 11524 1726776708.86607: Set connection var ansible_module_compression to ZIP_DEFLATED 11524 1726776708.86612: Set connection var ansible_shell_type to sh 11524 1726776708.86618: Set connection var ansible_shell_executable to /bin/sh 11524 1726776708.86636: variable 'ansible_shell_executable' from source: unknown 11524 1726776708.86641: variable 'ansible_connection' from source: unknown 11524 1726776708.86644: variable 'ansible_module_compression' from source: unknown 11524 1726776708.86648: variable 'ansible_shell_type' from source: unknown 11524 1726776708.86651: variable 'ansible_shell_executable' from source: unknown 11524 1726776708.86655: variable 'ansible_host' from source: host vars for 'managed_node2' 11524 1726776708.86659: variable 'ansible_pipelining' from source: unknown 11524 1726776708.86662: variable 'ansible_timeout' from source: unknown 11524 1726776708.86666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11524 1726776708.86757: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11524 1726776708.86770: variable 'omit' from source: magic vars 11524 1726776708.86776: starting attempt loop 11524 1726776708.86779: running the handler 11524 1726776708.86789: handler run complete 11524 1726776708.86798: attempt loop 
complete, returning result 11524 1726776708.86801: _execute() done 11524 1726776708.86804: dumping result to json 11524 1726776708.86808: done dumping result, returning 11524 1726776708.86814: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-cec2-986e-000000000ee7] 11524 1726776708.86820: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee7 11524 1726776708.86845: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee7 11524 1726776708.86848: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8218 1726776708.87001: no more pending results, returning what we have 8218 1726776708.87004: results queue empty 8218 1726776708.87005: checking for any_errors_fatal 8218 1726776708.87009: done checking for any_errors_fatal 8218 1726776708.87010: checking for max_fail_percentage 8218 1726776708.87011: done checking for max_fail_percentage 8218 1726776708.87011: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.87012: done checking to see if all hosts have failed 8218 1726776708.87012: getting the remaining hosts for this loop 8218 1726776708.87013: done getting the remaining hosts for this loop 8218 1726776708.87015: getting the next task for host managed_node2 8218 1726776708.87019: done getting next task for host managed_node2 8218 1726776708.87021: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8218 1726776708.87025: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8218 1726776708.87033: getting variables
8218 1726776708.87034: in VariableManager get_vars()
8218 1726776708.87058: Calling all_inventory to load vars for managed_node2
8218 1726776708.87060: Calling groups_inventory to load vars for managed_node2
8218 1726776708.87061: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776708.87068: Calling all_plugins_play to load vars for managed_node2
8218 1726776708.87070: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776708.87072: Calling groups_plugins_play to load vars for managed_node2
8218 1726776708.87178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776708.87297: done with get_vars()
8218 1726776708.87304: done getting variables
8218 1726776708.87345: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181
Thursday 19 September 2024  16:11:48 -0400 (0:00:00.016)       0:01:34.704 ****
8218 1726776708.87368: entering _queue_task() for managed_node2/set_fact
8218 1726776708.87520: worker is 1 (out of 1 available)
8218 1726776708.87535: exiting _queue_task() for managed_node2/set_fact
8218 1726776708.87547: done queuing things up, now waiting for results queue to drain
8218 1726776708.87548: waiting for pending results...
11525 1726776708.87677: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing
11525 1726776708.87795: in run() - task 120fa90a-8a95-cec2-986e-000000000ee8
11525 1726776708.87810: variable 'ansible_search_path' from source: unknown
11525 1726776708.87814: variable 'ansible_search_path' from source: unknown
11525 1726776708.87842: calling self._execute()
11525 1726776708.87912: variable 'ansible_host' from source: host vars for 'managed_node2'
11525 1726776708.87921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11525 1726776708.87932: variable 'omit' from source: magic vars
11525 1726776708.88004: variable 'omit' from source: magic vars
11525 1726776708.88042: variable 'omit' from source: magic vars
11525 1726776708.88301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11525 1726776708.88540: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11525 1726776708.88574: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11525 1726776708.88600: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11525 1726776708.88626: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11525 1726776708.88726: variable '__kernel_settings_register_profile' from source: set_fact
11525 1726776708.88742: variable '__kernel_settings_register_mode' from source:
set_fact 11525 1726776708.88750: variable '__kernel_settings_register_apply' from source: set_fact 11525 1726776708.88787: variable 'omit' from source: magic vars 11525 1726776708.88808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11525 1726776708.88830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11525 1726776708.88847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11525 1726776708.88861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11525 1726776708.88871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11525 1726776708.88893: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11525 1726776708.88898: variable 'ansible_host' from source: host vars for 'managed_node2' 11525 1726776708.88902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11525 1726776708.88966: Set connection var ansible_connection to ssh 11525 1726776708.88975: Set connection var ansible_pipelining to False 11525 1726776708.88982: Set connection var ansible_timeout to 10 11525 1726776708.88989: Set connection var ansible_module_compression to ZIP_DEFLATED 11525 1726776708.88994: Set connection var ansible_shell_type to sh 11525 1726776708.88999: Set connection var ansible_shell_executable to /bin/sh 11525 1726776708.89013: variable 'ansible_shell_executable' from source: unknown 11525 1726776708.89017: variable 'ansible_connection' from source: unknown 11525 1726776708.89020: variable 'ansible_module_compression' from source: unknown 11525 1726776708.89024: variable 'ansible_shell_type' from source: unknown 11525 1726776708.89027: variable 'ansible_shell_executable' from source: unknown 11525 1726776708.89033: 
variable 'ansible_host' from source: host vars for 'managed_node2' 11525 1726776708.89037: variable 'ansible_pipelining' from source: unknown 11525 1726776708.89040: variable 'ansible_timeout' from source: unknown 11525 1726776708.89044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11525 1726776708.89110: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11525 1726776708.89121: variable 'omit' from source: magic vars 11525 1726776708.89127: starting attempt loop 11525 1726776708.89132: running the handler 11525 1726776708.89142: handler run complete 11525 1726776708.89150: attempt loop complete, returning result 11525 1726776708.89153: _execute() done 11525 1726776708.89155: dumping result to json 11525 1726776708.89159: done dumping result, returning 11525 1726776708.89166: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-cec2-986e-000000000ee8] 11525 1726776708.89172: sending task result for task 120fa90a-8a95-cec2-986e-000000000ee8 11525 1726776708.89192: done sending task result for task 120fa90a-8a95-cec2-986e-000000000ee8 11525 1726776708.89195: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8218 1726776708.89348: no more pending results, returning what we have 8218 1726776708.89350: results queue empty 8218 1726776708.89351: checking for any_errors_fatal 8218 1726776708.89357: done checking for any_errors_fatal 8218 1726776708.89357: checking for max_fail_percentage 8218 1726776708.89359: done checking for max_fail_percentage 8218 1726776708.89359: checking to see if all hosts have failed and the 
running result is not ok 8218 1726776708.89360: done checking to see if all hosts have failed 8218 1726776708.89360: getting the remaining hosts for this loop 8218 1726776708.89361: done getting the remaining hosts for this loop 8218 1726776708.89364: getting the next task for host managed_node2 8218 1726776708.89374: done getting next task for host managed_node2 8218 1726776708.89376: ^ task is: TASK: meta (role_complete) 8218 1726776708.89379: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776708.89387: getting variables 8218 1726776708.89388: in VariableManager get_vars() 8218 1726776708.89413: Calling all_inventory to load vars for managed_node2 8218 1726776708.89415: Calling groups_inventory to load vars for managed_node2 8218 1726776708.89416: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776708.89423: Calling all_plugins_play to load vars for managed_node2 8218 1726776708.89425: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776708.89426: Calling groups_plugins_play to load vars for managed_node2 8218 1726776708.89535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776708.89691: done with get_vars() 8218 1726776708.89698: done getting variables 8218 1726776708.89752: done queuing things up, now waiting for results queue to drain 8218 1726776708.89757: results queue empty 8218 1726776708.89757: checking for any_errors_fatal 8218 1726776708.89760: done checking for any_errors_fatal 8218 1726776708.89761: checking for max_fail_percentage 8218 1726776708.89761: done checking for max_fail_percentage 8218 1726776708.89762: checking to see if all hosts have failed and the running result is not ok 8218 1726776708.89762: done checking to see if all hosts have failed 8218 1726776708.89762: getting the remaining hosts for this loop 8218 1726776708.89763: done getting the remaining hosts for this loop 8218 1726776708.89764: getting the next task for host managed_node2 8218 1726776708.89769: done getting next task for host managed_node2 8218 1726776708.89770: ^ task is: TASK: Verify no settings 8218 1726776708.89771: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
8218 1726776708.89773: getting variables
8218 1726776708.89774: in VariableManager get_vars()
8218 1726776708.89781: Calling all_inventory to load vars for managed_node2
8218 1726776708.89782: Calling groups_inventory to load vars for managed_node2
8218 1726776708.89783: Calling all_plugins_inventory to load vars for managed_node2
8218 1726776708.89786: Calling all_plugins_play to load vars for managed_node2
8218 1726776708.89787: Calling groups_plugins_inventory to load vars for managed_node2
8218 1726776708.89789: Calling groups_plugins_play to load vars for managed_node2
8218 1726776708.89865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8218 1726776708.89970: done with get_vars()
8218 1726776708.89976: done getting variables
8218 1726776708.90000: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Verify no settings] ******************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20
Thursday 19 September 2024  16:11:48 -0400 (0:00:00.026)       0:01:34.730 ****
8218 1726776708.90018: entering _queue_task() for managed_node2/shell
8218 1726776708.90164: worker is 1 (out of 1 available)
8218 1726776708.90181: exiting _queue_task() for managed_node2/shell
8218 1726776708.90192: done queuing things up, now waiting for results
queue to drain 8218 1726776708.90193: waiting for pending results... 11526 1726776708.90318: running TaskExecutor() for managed_node2/TASK: Verify no settings 11526 1726776708.90412: in run() - task 120fa90a-8a95-cec2-986e-000000000cae 11526 1726776708.90428: variable 'ansible_search_path' from source: unknown 11526 1726776708.90433: variable 'ansible_search_path' from source: unknown 11526 1726776708.90459: calling self._execute() 11526 1726776708.90523: variable 'ansible_host' from source: host vars for 'managed_node2' 11526 1726776708.90533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11526 1726776708.90542: variable 'omit' from source: magic vars 11526 1726776708.90616: variable 'omit' from source: magic vars 11526 1726776708.90648: variable 'omit' from source: magic vars 11526 1726776708.90879: variable '__kernel_settings_profile_filename' from source: role '' exported vars 11526 1726776708.90934: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11526 1726776708.90996: variable '__kernel_settings_profile_parent' from source: set_fact 11526 1726776708.91005: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11526 1726776708.91040: variable 'omit' from source: magic vars 11526 1726776708.91070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11526 1726776708.91096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11526 1726776708.91114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11526 1726776708.91130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11526 1726776708.91141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11526 1726776708.91163: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11526 1726776708.91168: variable 'ansible_host' from source: host vars for 'managed_node2' 11526 1726776708.91172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11526 1726776708.91235: Set connection var ansible_connection to ssh 11526 1726776708.91243: Set connection var ansible_pipelining to False 11526 1726776708.91249: Set connection var ansible_timeout to 10 11526 1726776708.91256: Set connection var ansible_module_compression to ZIP_DEFLATED 11526 1726776708.91262: Set connection var ansible_shell_type to sh 11526 1726776708.91268: Set connection var ansible_shell_executable to /bin/sh 11526 1726776708.91285: variable 'ansible_shell_executable' from source: unknown 11526 1726776708.91289: variable 'ansible_connection' from source: unknown 11526 1726776708.91293: variable 'ansible_module_compression' from source: unknown 11526 1726776708.91296: variable 'ansible_shell_type' from source: unknown 11526 1726776708.91299: variable 'ansible_shell_executable' from source: unknown 11526 1726776708.91303: variable 'ansible_host' from source: host vars for 'managed_node2' 11526 1726776708.91307: variable 'ansible_pipelining' from source: unknown 11526 1726776708.91310: variable 'ansible_timeout' from source: unknown 11526 1726776708.91314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11526 1726776708.91399: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11526 1726776708.91412: variable 'omit' from source: magic vars 11526 1726776708.91417: starting attempt loop 11526 1726776708.91421: running the handler 11526 1726776708.91431: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11526 1726776708.91445: _low_level_execute_command(): starting 11526 1726776708.91453: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11526 1726776708.93783: stdout chunk (state=2): >>>/root <<< 11526 1726776708.93902: stderr chunk (state=3): >>><<< 11526 1726776708.93909: stdout chunk (state=3): >>><<< 11526 1726776708.93925: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11526 1726776708.93938: _low_level_execute_command(): starting 11526 1726776708.93944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710 `" && echo ansible-tmp-1726776708.9393353-11526-7424189956710="` echo /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710 `" ) && sleep 0' 11526 1726776708.96563: stdout chunk (state=2): >>>ansible-tmp-1726776708.9393353-11526-7424189956710=/root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710 <<< 11526 1726776708.96691: stderr chunk (state=3): >>><<< 11526 1726776708.96697: stdout chunk (state=3): >>><<< 11526 1726776708.96709: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776708.9393353-11526-7424189956710=/root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710 , stderr= 11526 1726776708.96732: variable 'ansible_module_compression' from source: unknown 11526 1726776708.96772: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11526 1726776708.96803: variable 'ansible_facts' from source: unknown 11526 1726776708.96869: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/AnsiballZ_command.py 11526 1726776708.97015: Sending initial data 11526 1726776708.97022: Sent initial data (153 bytes) 11526 1726776708.99526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmporj3ckax /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/AnsiballZ_command.py <<< 11526 1726776709.00834: stderr chunk (state=3): >>><<< 11526 1726776709.00841: stdout chunk (state=3): >>><<< 11526 1726776709.00862: done transferring module to remote 11526 1726776709.00875: _low_level_execute_command(): starting 11526 1726776709.00881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/ /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/AnsiballZ_command.py && sleep 0' 11526 1726776709.03260: stderr chunk (state=2): >>><<< 11526 1726776709.03269: stdout chunk (state=2): >>><<< 11526 1726776709.03281: _low_level_execute_command() done: rc=0, stdout=, stderr= 11526 1726776709.03285: _low_level_execute_command(): starting 11526 1726776709.03290: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/AnsiballZ_command.py && sleep 0' 11526 1726776709.19446: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 
1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 16:11:49.185243", "end": "2024-09-19 16:11:49.192741", "delta": "0:00:00.007498", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11526 1726776709.20616: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11526 1726776709.20664: stderr chunk (state=3): >>><<< 11526 1726776709.20673: stdout chunk (state=3): >>><<< 11526 1726776709.20690: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 16:11:49.185243", "end": "2024-09-19 
16:11:49.192741", "delta": "0:00:00.007498", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11526 1726776709.20721: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11526 1726776709.20732: _low_level_execute_command(): starting 11526 1726776709.20738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776708.9393353-11526-7424189956710/ > /dev/null 2>&1 && sleep 0' 11526 1726776709.23166: stderr chunk (state=2): >>><<< 11526 1726776709.23175: stdout chunk (state=2): >>><<< 11526 1726776709.23189: _low_level_execute_command() 
done: rc=0, stdout=, stderr= 11526 1726776709.23197: handler run complete 11526 1726776709.23215: Evaluated conditional (False): False 11526 1726776709.23224: attempt loop complete, returning result 11526 1726776709.23227: _execute() done 11526 1726776709.23232: dumping result to json 11526 1726776709.23238: done dumping result, returning 11526 1726776709.23245: done running TaskExecutor() for managed_node2/TASK: Verify no settings [120fa90a-8a95-cec2-986e-000000000cae] 11526 1726776709.23252: sending task result for task 120fa90a-8a95-cec2-986e-000000000cae 11526 1726776709.23285: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cae 11526 1726776709.23289: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007498", "end": "2024-09-19 16:11:49.192741", "rc": 0, "start": "2024-09-19 16:11:49.185243" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 8218 1726776709.23494: no more pending results, returning what we have 8218 1726776709.23497: results queue empty 8218 1726776709.23498: checking for any_errors_fatal 8218 1726776709.23500: done checking for any_errors_fatal 8218 1726776709.23500: checking for max_fail_percentage 8218 1726776709.23502: done checking for max_fail_percentage 8218 1726776709.23502: checking to see if all hosts have 
failed and the running result is not ok 8218 1726776709.23503: done checking to see if all hosts have failed 8218 1726776709.23504: getting the remaining hosts for this loop 8218 1726776709.23505: done getting the remaining hosts for this loop 8218 1726776709.23508: getting the next task for host managed_node2 8218 1726776709.23515: done getting next task for host managed_node2 8218 1726776709.23517: ^ task is: TASK: Remove kernel_settings tuned profile 8218 1726776709.23518: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776709.23521: getting variables 8218 1726776709.23522: in VariableManager get_vars() 8218 1726776709.23552: Calling all_inventory to load vars for managed_node2 8218 1726776709.23554: Calling groups_inventory to load vars for managed_node2 8218 1726776709.23555: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776709.23562: Calling all_plugins_play to load vars for managed_node2 8218 1726776709.23570: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776709.23572: Calling groups_plugins_play to load vars for managed_node2 8218 1726776709.23681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776709.23842: done with get_vars() 8218 1726776709.23850: done getting variables TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 16:11:49 -0400 (0:00:00.338) 0:01:35.069 **** 8218 1726776709.23917: entering _queue_task() for managed_node2/file 8218 1726776709.24082: worker is 1 (out of 1 available) 8218 1726776709.24097: exiting _queue_task() for managed_node2/file 8218 1726776709.24109: done queuing things up, now waiting for results queue to drain 8218 1726776709.24111: waiting for pending results... 
11537 1726776709.24246: running TaskExecutor() for managed_node2/TASK: Remove kernel_settings tuned profile 11537 1726776709.24349: in run() - task 120fa90a-8a95-cec2-986e-000000000caf 11537 1726776709.24365: variable 'ansible_search_path' from source: unknown 11537 1726776709.24370: variable 'ansible_search_path' from source: unknown 11537 1726776709.24399: calling self._execute() 11537 1726776709.24471: variable 'ansible_host' from source: host vars for 'managed_node2' 11537 1726776709.24480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11537 1726776709.24489: variable 'omit' from source: magic vars 11537 1726776709.24564: variable 'omit' from source: magic vars 11537 1726776709.24593: variable 'omit' from source: magic vars 11537 1726776709.24611: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11537 1726776709.24827: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11537 1726776709.24900: variable '__kernel_settings_profile_parent' from source: set_fact 11537 1726776709.24907: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11537 1726776709.24943: variable 'omit' from source: magic vars 11537 1726776709.24976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11537 1726776709.25003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11537 1726776709.25021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11537 1726776709.25037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11537 1726776709.25048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11537 1726776709.25072: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11537 1726776709.25077: variable 'ansible_host' from source: host vars for 'managed_node2' 11537 1726776709.25082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11537 1726776709.25148: Set connection var ansible_connection to ssh 11537 1726776709.25156: Set connection var ansible_pipelining to False 11537 1726776709.25163: Set connection var ansible_timeout to 10 11537 1726776709.25170: Set connection var ansible_module_compression to ZIP_DEFLATED 11537 1726776709.25176: Set connection var ansible_shell_type to sh 11537 1726776709.25181: Set connection var ansible_shell_executable to /bin/sh 11537 1726776709.25196: variable 'ansible_shell_executable' from source: unknown 11537 1726776709.25200: variable 'ansible_connection' from source: unknown 11537 1726776709.25203: variable 'ansible_module_compression' from source: unknown 11537 1726776709.25206: variable 'ansible_shell_type' from source: unknown 11537 1726776709.25209: variable 'ansible_shell_executable' from source: unknown 11537 1726776709.25212: variable 'ansible_host' from source: host vars for 'managed_node2' 11537 1726776709.25214: variable 'ansible_pipelining' from source: unknown 11537 1726776709.25216: variable 'ansible_timeout' from source: unknown 11537 1726776709.25218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11537 1726776709.25359: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11537 1726776709.25372: variable 'omit' from source: magic vars 11537 1726776709.25378: starting attempt loop 11537 1726776709.25382: running the handler 11537 1726776709.25393: _low_level_execute_command(): starting 11537 1726776709.25400: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11537 
1726776709.27735: stdout chunk (state=2): >>>/root <<< 11537 1726776709.27855: stderr chunk (state=3): >>><<< 11537 1726776709.27862: stdout chunk (state=3): >>><<< 11537 1726776709.27883: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11537 1726776709.27898: _low_level_execute_command(): starting 11537 1726776709.27904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715 `" && echo ansible-tmp-1726776709.2789273-11537-121140343657715="` echo /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715 `" ) && sleep 0' 11537 1726776709.30485: stdout chunk (state=2): >>>ansible-tmp-1726776709.2789273-11537-121140343657715=/root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715 <<< 11537 1726776709.30604: stderr chunk (state=3): >>><<< 11537 1726776709.30610: stdout chunk (state=3): >>><<< 11537 1726776709.30625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776709.2789273-11537-121140343657715=/root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715 , stderr= 11537 1726776709.30661: variable 'ansible_module_compression' from source: unknown 11537 1726776709.30705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11537 1726776709.30739: variable 'ansible_facts' from source: unknown 11537 1726776709.30808: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/AnsiballZ_file.py 11537 1726776709.30903: Sending initial data 11537 1726776709.30910: Sent initial data (152 bytes) 11537 1726776709.33400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpw68t2qy5 /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/AnsiballZ_file.py <<< 11537 1726776709.34474: stderr chunk 
(state=3): >>><<< 11537 1726776709.34480: stdout chunk (state=3): >>><<< 11537 1726776709.34497: done transferring module to remote 11537 1726776709.34507: _low_level_execute_command(): starting 11537 1726776709.34512: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/ /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/AnsiballZ_file.py && sleep 0' 11537 1726776709.36875: stderr chunk (state=2): >>><<< 11537 1726776709.36883: stdout chunk (state=2): >>><<< 11537 1726776709.36896: _low_level_execute_command() done: rc=0, stdout=, stderr= 11537 1726776709.36901: _low_level_execute_command(): starting 11537 1726776709.36906: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/AnsiballZ_file.py && sleep 0' 11537 1726776709.52843: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11537 1726776709.53923: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. 
<<< 11537 1726776709.53973: stderr chunk (state=3): >>><<< 11537 1726776709.53979: stdout chunk (state=3): >>><<< 11537 1726776709.53997: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11537 1726776709.54027: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11537 1726776709.54040: _low_level_execute_command(): starting 11537 1726776709.54046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776709.2789273-11537-121140343657715/ > /dev/null 2>&1 && sleep 0' 11537 1726776709.56462: stderr chunk (state=2): >>><<< 11537 1726776709.56471: stdout chunk (state=2): >>><<< 11537 1726776709.56485: _low_level_execute_command() done: rc=0, stdout=, stderr= 11537 1726776709.56492: handler run complete 11537 1726776709.56515: attempt loop complete, returning result 11537 1726776709.56519: _execute() done 11537 1726776709.56522: dumping result to json 11537 1726776709.56528: done dumping result, returning 11537 1726776709.56535: done running TaskExecutor() for managed_node2/TASK: Remove kernel_settings tuned profile [120fa90a-8a95-cec2-986e-000000000caf] 11537 1726776709.56542: sending task result for task 120fa90a-8a95-cec2-986e-000000000caf 11537 1726776709.56574: done sending task result for task 120fa90a-8a95-cec2-986e-000000000caf 11537 1726776709.56578: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 8218 1726776709.56719: no more pending results, returning what we have 8218 1726776709.56722: results 
queue empty 8218 1726776709.56723: checking for any_errors_fatal 8218 1726776709.56734: done checking for any_errors_fatal 8218 1726776709.56735: checking for max_fail_percentage 8218 1726776709.56736: done checking for max_fail_percentage 8218 1726776709.56737: checking to see if all hosts have failed and the running result is not ok 8218 1726776709.56737: done checking to see if all hosts have failed 8218 1726776709.56738: getting the remaining hosts for this loop 8218 1726776709.56739: done getting the remaining hosts for this loop 8218 1726776709.56742: getting the next task for host managed_node2 8218 1726776709.56748: done getting next task for host managed_node2 8218 1726776709.56750: ^ task is: TASK: Get active_profile 8218 1726776709.56753: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776709.56756: getting variables 8218 1726776709.56757: in VariableManager get_vars() 8218 1726776709.56792: Calling all_inventory to load vars for managed_node2 8218 1726776709.56795: Calling groups_inventory to load vars for managed_node2 8218 1726776709.56797: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776709.56806: Calling all_plugins_play to load vars for managed_node2 8218 1726776709.56808: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776709.56810: Calling groups_plugins_play to load vars for managed_node2 8218 1726776709.56927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776709.57044: done with get_vars() 8218 1726776709.57053: done getting variables TASK [Get active_profile] ****************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 16:11:49 -0400 (0:00:00.332) 0:01:35.401 **** 8218 1726776709.57120: entering _queue_task() for managed_node2/slurp 8218 1726776709.57287: worker is 1 (out of 1 available) 8218 1726776709.57301: exiting _queue_task() for managed_node2/slurp 8218 1726776709.57312: done queuing things up, now waiting for results queue to drain 8218 1726776709.57314: waiting for pending results... 
11545 1726776709.57447: running TaskExecutor() for managed_node2/TASK: Get active_profile 11545 1726776709.57547: in run() - task 120fa90a-8a95-cec2-986e-000000000cb0 11545 1726776709.57563: variable 'ansible_search_path' from source: unknown 11545 1726776709.57568: variable 'ansible_search_path' from source: unknown 11545 1726776709.57596: calling self._execute() 11545 1726776709.57668: variable 'ansible_host' from source: host vars for 'managed_node2' 11545 1726776709.57677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11545 1726776709.57686: variable 'omit' from source: magic vars 11545 1726776709.57763: variable 'omit' from source: magic vars 11545 1726776709.57794: variable 'omit' from source: magic vars 11545 1726776709.57814: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11545 1726776709.58033: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11545 1726776709.58092: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11545 1726776709.58120: variable 'omit' from source: magic vars 11545 1726776709.58154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11545 1726776709.58180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11545 1726776709.58199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11545 1726776709.58275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11545 1726776709.58286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11545 1726776709.58310: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11545 1726776709.58315: variable 'ansible_host' from source: host vars for 
'managed_node2' 11545 1726776709.58320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11545 1726776709.58386: Set connection var ansible_connection to ssh 11545 1726776709.58394: Set connection var ansible_pipelining to False 11545 1726776709.58400: Set connection var ansible_timeout to 10 11545 1726776709.58408: Set connection var ansible_module_compression to ZIP_DEFLATED 11545 1726776709.58413: Set connection var ansible_shell_type to sh 11545 1726776709.58418: Set connection var ansible_shell_executable to /bin/sh 11545 1726776709.58435: variable 'ansible_shell_executable' from source: unknown 11545 1726776709.58439: variable 'ansible_connection' from source: unknown 11545 1726776709.58442: variable 'ansible_module_compression' from source: unknown 11545 1726776709.58446: variable 'ansible_shell_type' from source: unknown 11545 1726776709.58449: variable 'ansible_shell_executable' from source: unknown 11545 1726776709.58453: variable 'ansible_host' from source: host vars for 'managed_node2' 11545 1726776709.58457: variable 'ansible_pipelining' from source: unknown 11545 1726776709.58460: variable 'ansible_timeout' from source: unknown 11545 1726776709.58464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11545 1726776709.58600: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11545 1726776709.58611: variable 'omit' from source: magic vars 11545 1726776709.58617: starting attempt loop 11545 1726776709.58621: running the handler 11545 1726776709.58635: _low_level_execute_command(): starting 11545 1726776709.58643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11545 1726776709.60943: stdout chunk (state=2): >>>/root <<< 11545 1726776709.61061: stderr chunk (state=3): 
>>><<< 11545 1726776709.61067: stdout chunk (state=3): >>><<< 11545 1726776709.61087: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11545 1726776709.61100: _low_level_execute_command(): starting 11545 1726776709.61106: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971 `" && echo ansible-tmp-1726776709.6109533-11545-209227790672971="` echo /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971 `" ) && sleep 0' 11545 1726776709.63637: stdout chunk (state=2): >>>ansible-tmp-1726776709.6109533-11545-209227790672971=/root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971 <<< 11545 1726776709.63762: stderr chunk (state=3): >>><<< 11545 1726776709.63770: stdout chunk (state=3): >>><<< 11545 1726776709.63784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776709.6109533-11545-209227790672971=/root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971 , stderr= 11545 1726776709.63818: variable 'ansible_module_compression' from source: unknown 11545 1726776709.63851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11545 1726776709.63883: variable 'ansible_facts' from source: unknown 11545 1726776709.63942: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/AnsiballZ_slurp.py 11545 1726776709.64036: Sending initial data 11545 1726776709.64043: Sent initial data (153 bytes) 11545 1726776709.66515: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpffbk6cow /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/AnsiballZ_slurp.py <<< 11545 1726776709.67572: stderr chunk (state=3): >>><<< 11545 1726776709.67583: stdout chunk (state=3): >>><<< 11545 1726776709.67604: done 
transferring module to remote 11545 1726776709.67616: _low_level_execute_command(): starting 11545 1726776709.67621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/ /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/AnsiballZ_slurp.py && sleep 0' 11545 1726776709.69999: stderr chunk (state=2): >>><<< 11545 1726776709.70007: stdout chunk (state=2): >>><<< 11545 1726776709.70021: _low_level_execute_command() done: rc=0, stdout=, stderr= 11545 1726776709.70025: _low_level_execute_command(): starting 11545 1726776709.70032: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/AnsiballZ_slurp.py && sleep 0' 11545 1726776709.84928: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11545 1726776709.85984: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11545 1726776709.86033: stderr chunk (state=3): >>><<< 11545 1726776709.86039: stdout chunk (state=3): >>><<< 11545 1726776709.86055: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.12.75 closed. 
11545 1726776709.86082: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11545 1726776709.86094: _low_level_execute_command(): starting 11545 1726776709.86100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776709.6109533-11545-209227790672971/ > /dev/null 2>&1 && sleep 0' 11545 1726776709.88517: stderr chunk (state=2): >>><<< 11545 1726776709.88524: stdout chunk (state=2): >>><<< 11545 1726776709.88541: _low_level_execute_command() done: rc=0, stdout=, stderr= 11545 1726776709.88549: handler run complete 11545 1726776709.88562: attempt loop complete, returning result 11545 1726776709.88566: _execute() done 11545 1726776709.88571: dumping result to json 11545 1726776709.88576: done dumping result, returning 11545 1726776709.88582: done running TaskExecutor() for managed_node2/TASK: Get active_profile [120fa90a-8a95-cec2-986e-000000000cb0] 11545 1726776709.88589: sending task result for task 120fa90a-8a95-cec2-986e-000000000cb0 11545 1726776709.88616: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cb0 11545 1726776709.88620: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8218 1726776709.88744: no more pending results, returning what we have 8218 
1726776709.88747: results queue empty 8218 1726776709.88748: checking for any_errors_fatal 8218 1726776709.88756: done checking for any_errors_fatal 8218 1726776709.88757: checking for max_fail_percentage 8218 1726776709.88758: done checking for max_fail_percentage 8218 1726776709.88759: checking to see if all hosts have failed and the running result is not ok 8218 1726776709.88759: done checking to see if all hosts have failed 8218 1726776709.88760: getting the remaining hosts for this loop 8218 1726776709.88761: done getting the remaining hosts for this loop 8218 1726776709.88764: getting the next task for host managed_node2 8218 1726776709.88770: done getting next task for host managed_node2 8218 1726776709.88772: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8218 1726776709.88775: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776709.88778: getting variables 8218 1726776709.88779: in VariableManager get_vars() 8218 1726776709.88812: Calling all_inventory to load vars for managed_node2 8218 1726776709.88815: Calling groups_inventory to load vars for managed_node2 8218 1726776709.88816: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776709.88826: Calling all_plugins_play to load vars for managed_node2 8218 1726776709.88830: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776709.88833: Calling groups_plugins_play to load vars for managed_node2 8218 1726776709.88962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776709.89122: done with get_vars() 8218 1726776709.89132: done getting variables 8218 1726776709.89176: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 16:11:49 -0400 (0:00:00.320) 0:01:35.722 **** 8218 1726776709.89197: entering _queue_task() for managed_node2/copy 8218 1726776709.89364: worker is 1 (out of 1 available) 8218 1726776709.89377: exiting _queue_task() for managed_node2/copy 8218 1726776709.89389: done queuing things up, now waiting for results queue to drain 8218 1726776709.89390: waiting for pending results... 
11553 1726776709.89524: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings is not in active_profile 11553 1726776709.89628: in run() - task 120fa90a-8a95-cec2-986e-000000000cb1 11553 1726776709.89645: variable 'ansible_search_path' from source: unknown 11553 1726776709.89649: variable 'ansible_search_path' from source: unknown 11553 1726776709.89679: calling self._execute() 11553 1726776709.89755: variable 'ansible_host' from source: host vars for 'managed_node2' 11553 1726776709.89764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11553 1726776709.89775: variable 'omit' from source: magic vars 11553 1726776709.89854: variable 'omit' from source: magic vars 11553 1726776709.89886: variable 'omit' from source: magic vars 11553 1726776709.89908: variable '__active_profile' from source: task vars 11553 1726776709.90133: variable '__active_profile' from source: task vars 11553 1726776709.90280: variable '__cur_profile' from source: task vars 11553 1726776709.90389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11553 1726776709.91914: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11553 1726776709.92248: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11553 1726776709.92279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11553 1726776709.92305: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11553 1726776709.92326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11553 1726776709.92383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11553 1726776709.92405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11553 1726776709.92423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11553 1726776709.92452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11553 1726776709.92464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11553 1726776709.92539: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11553 1726776709.92583: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11553 1726776709.92639: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11553 1726776709.92694: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11553 1726776709.92715: variable 'omit' from source: magic vars 11553 1726776709.92737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11553 1726776709.92757: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11553 1726776709.92776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11553 1726776709.92789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11553 1726776709.92799: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11553 1726776709.92821: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11553 1726776709.92826: variable 'ansible_host' from source: host vars for 'managed_node2' 11553 1726776709.92832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11553 1726776709.92898: Set connection var ansible_connection to ssh 11553 1726776709.92906: Set connection var ansible_pipelining to False 11553 1726776709.92912: Set connection var ansible_timeout to 10 11553 1726776709.92919: Set connection var ansible_module_compression to ZIP_DEFLATED 11553 1726776709.92924: Set connection var ansible_shell_type to sh 11553 1726776709.92931: Set connection var ansible_shell_executable to /bin/sh 11553 1726776709.92947: variable 'ansible_shell_executable' from source: unknown 11553 1726776709.92951: variable 'ansible_connection' from source: unknown 11553 1726776709.92954: variable 'ansible_module_compression' from source: unknown 11553 1726776709.92958: variable 'ansible_shell_type' from source: unknown 11553 1726776709.92961: variable 'ansible_shell_executable' from source: unknown 11553 1726776709.92964: variable 'ansible_host' from source: host vars for 'managed_node2' 11553 1726776709.92970: variable 'ansible_pipelining' from source: unknown 11553 1726776709.92974: variable 'ansible_timeout' from source: unknown 11553 1726776709.92978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11553 1726776709.93042: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11553 1726776709.93054: variable 'omit' from source: magic vars 11553 1726776709.93059: starting attempt 
loop 11553 1726776709.93063: running the handler 11553 1726776709.93075: _low_level_execute_command(): starting 11553 1726776709.93082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11553 1726776709.95398: stdout chunk (state=2): >>>/root <<< 11553 1726776709.95524: stderr chunk (state=3): >>><<< 11553 1726776709.95533: stdout chunk (state=3): >>><<< 11553 1726776709.95553: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11553 1726776709.95566: _low_level_execute_command(): starting 11553 1726776709.95574: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535 `" && echo ansible-tmp-1726776709.9556186-11553-228454576721535="` echo /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535 `" ) && sleep 0' 11553 1726776709.98270: stdout chunk (state=2): >>>ansible-tmp-1726776709.9556186-11553-228454576721535=/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535 <<< 11553 1726776709.98397: stderr chunk (state=3): >>><<< 11553 1726776709.98404: stdout chunk (state=3): >>><<< 11553 1726776709.98418: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776709.9556186-11553-228454576721535=/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535 , stderr= 11553 1726776709.98487: variable 'ansible_module_compression' from source: unknown 11553 1726776709.98531: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11553 1726776709.98562: variable 'ansible_facts' from source: unknown 11553 1726776709.98632: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_stat.py 11553 1726776709.98715: Sending initial data 11553 1726776709.98722: Sent initial data (152 bytes) 11553 1726776710.01221: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpntv3uhzz /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_stat.py <<< 11553 1726776710.02400: stderr chunk (state=3): >>><<< 11553 1726776710.02407: stdout chunk (state=3): >>><<< 11553 1726776710.02423: done transferring module to remote 11553 1726776710.02435: _low_level_execute_command(): starting 11553 1726776710.02440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/ /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_stat.py && sleep 0' 11553 1726776710.04773: stderr chunk (state=2): >>><<< 11553 1726776710.04781: stdout chunk (state=2): >>><<< 11553 1726776710.04794: _low_level_execute_command() done: rc=0, stdout=, stderr= 11553 1726776710.04798: _low_level_execute_command(): starting 11553 1726776710.04803: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_stat.py && sleep 0' 11553 1726776710.21085: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776709.8476226, "mtime": 1726776707.2946131, "ctime": 1726776707.2946131, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": 
"500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11553 1726776710.22206: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11553 1726776710.22257: stderr chunk (state=3): >>><<< 11553 1726776710.22264: stdout chunk (state=3): >>><<< 11553 1726776710.22283: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726776709.8476226, "mtime": 1726776707.2946131, "ctime": 1726776707.2946131, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "500822512", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11553 1726776710.22320: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11553 1726776710.22405: Sending initial data 11553 1726776710.22413: Sent initial data (141 bytes) 11553 1726776710.24959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpd68beavo /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source <<< 11553 1726776710.25297: stderr chunk (state=3): >>><<< 11553 1726776710.25303: stdout chunk (state=3): >>><<< 11553 1726776710.25322: _low_level_execute_command(): starting 11553 1726776710.25328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/ /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source && sleep 0' 11553 1726776710.27625: stderr chunk (state=2): >>><<< 11553 1726776710.27634: stdout chunk (state=2): >>><<< 11553 1726776710.27646: _low_level_execute_command() done: rc=0, stdout=, stderr= 11553 1726776710.27665: variable 'ansible_module_compression' from source: unknown 11553 1726776710.27697: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11553 1726776710.27714: variable 'ansible_facts' from 
source: unknown 11553 1726776710.27775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_copy.py 11553 1726776710.27854: Sending initial data 11553 1726776710.27861: Sent initial data (152 bytes) 11553 1726776710.30290: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpa1vbdeyn /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_copy.py <<< 11553 1726776710.31355: stderr chunk (state=3): >>><<< 11553 1726776710.31361: stdout chunk (state=3): >>><<< 11553 1726776710.31378: done transferring module to remote 11553 1726776710.31386: _low_level_execute_command(): starting 11553 1726776710.31393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/ /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_copy.py && sleep 0' 11553 1726776710.33681: stderr chunk (state=2): >>><<< 11553 1726776710.33688: stdout chunk (state=2): >>><<< 11553 1726776710.33700: _low_level_execute_command() done: rc=0, stdout=, stderr= 11553 1726776710.33704: _low_level_execute_command(): starting 11553 1726776710.33709: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/AnsiballZ_copy.py && sleep 0' 11553 1726776710.50440: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": 
"/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source", "_original_basename": "tmpd68beavo", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11553 1726776710.51656: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11553 1726776710.51703: stderr chunk (state=3): >>><<< 11553 1726776710.51711: stdout chunk (state=3): >>><<< 11553 1726776710.51726: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source", "_original_basename": "tmpd68beavo", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11553 1726776710.51753: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source', '_original_basename': 'tmpd68beavo', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11553 1726776710.51763: _low_level_execute_command(): starting 11553 1726776710.51771: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/ > /dev/null 2>&1 && sleep 0' 11553 1726776710.54179: stderr chunk (state=2): >>><<< 11553 1726776710.54186: stdout chunk (state=2): >>><<< 11553 1726776710.54200: _low_level_execute_command() done: rc=0, stdout=, stderr= 11553 1726776710.54208: handler run complete 11553 1726776710.54226: attempt loop complete, returning result 11553 1726776710.54232: _execute() done 11553 1726776710.54235: dumping result to json 11553 1726776710.54241: done dumping result, returning 11553 1726776710.54247: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings is not in active_profile [120fa90a-8a95-cec2-986e-000000000cb1] 11553 1726776710.54253: sending task result for task 120fa90a-8a95-cec2-986e-000000000cb1 11553 1726776710.54285: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cb1 11553 
1726776710.54289: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726776709.9556186-11553-228454576721535/source", "state": "file", "uid": 0 } 8218 1726776710.54424: no more pending results, returning what we have 8218 1726776710.54427: results queue empty 8218 1726776710.54428: checking for any_errors_fatal 8218 1726776710.54437: done checking for any_errors_fatal 8218 1726776710.54438: checking for max_fail_percentage 8218 1726776710.54439: done checking for max_fail_percentage 8218 1726776710.54440: checking to see if all hosts have failed and the running result is not ok 8218 1726776710.54441: done checking to see if all hosts have failed 8218 1726776710.54441: getting the remaining hosts for this loop 8218 1726776710.54442: done getting the remaining hosts for this loop 8218 1726776710.54445: getting the next task for host managed_node2 8218 1726776710.54451: done getting next task for host managed_node2 8218 1726776710.54452: ^ task is: TASK: Set profile_mode to auto 8218 1726776710.54455: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776710.54458: getting variables 8218 1726776710.54459: in VariableManager get_vars() 8218 1726776710.54493: Calling all_inventory to load vars for managed_node2 8218 1726776710.54495: Calling groups_inventory to load vars for managed_node2 8218 1726776710.54497: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776710.54506: Calling all_plugins_play to load vars for managed_node2 8218 1726776710.54509: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776710.54511: Calling groups_plugins_play to load vars for managed_node2 8218 1726776710.54634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776710.54749: done with get_vars() 8218 1726776710.54759: done getting variables 8218 1726776710.54801: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 16:11:50 -0400 (0:00:00.656) 0:01:36.378 **** 8218 1726776710.54822: entering _queue_task() for managed_node2/copy 8218 1726776710.54982: worker is 1 (out of 1 available) 8218 1726776710.54997: exiting _queue_task() for managed_node2/copy 8218 1726776710.55010: done queuing things up, now waiting for results queue to drain 8218 1726776710.55011: waiting for pending results... 
11571 1726776710.55145: running TaskExecutor() for managed_node2/TASK: Set profile_mode to auto 11571 1726776710.55246: in run() - task 120fa90a-8a95-cec2-986e-000000000cb2 11571 1726776710.55263: variable 'ansible_search_path' from source: unknown 11571 1726776710.55270: variable 'ansible_search_path' from source: unknown 11571 1726776710.55298: calling self._execute() 11571 1726776710.55371: variable 'ansible_host' from source: host vars for 'managed_node2' 11571 1726776710.55379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11571 1726776710.55388: variable 'omit' from source: magic vars 11571 1726776710.55472: variable 'omit' from source: magic vars 11571 1726776710.55503: variable 'omit' from source: magic vars 11571 1726776710.55525: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 11571 1726776710.55743: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 11571 1726776710.55805: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11571 1726776710.55898: variable 'omit' from source: magic vars 11571 1726776710.55931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11571 1726776710.55957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11571 1726776710.55978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11571 1726776710.55992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11571 1726776710.56004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11571 1726776710.56026: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11571 1726776710.56034: variable 'ansible_host' from source: host vars for 
'managed_node2' 11571 1726776710.56038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11571 1726776710.56106: Set connection var ansible_connection to ssh 11571 1726776710.56114: Set connection var ansible_pipelining to False 11571 1726776710.56121: Set connection var ansible_timeout to 10 11571 1726776710.56128: Set connection var ansible_module_compression to ZIP_DEFLATED 11571 1726776710.56136: Set connection var ansible_shell_type to sh 11571 1726776710.56142: Set connection var ansible_shell_executable to /bin/sh 11571 1726776710.56156: variable 'ansible_shell_executable' from source: unknown 11571 1726776710.56160: variable 'ansible_connection' from source: unknown 11571 1726776710.56163: variable 'ansible_module_compression' from source: unknown 11571 1726776710.56166: variable 'ansible_shell_type' from source: unknown 11571 1726776710.56172: variable 'ansible_shell_executable' from source: unknown 11571 1726776710.56176: variable 'ansible_host' from source: host vars for 'managed_node2' 11571 1726776710.56180: variable 'ansible_pipelining' from source: unknown 11571 1726776710.56183: variable 'ansible_timeout' from source: unknown 11571 1726776710.56187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11571 1726776710.56280: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11571 1726776710.56292: variable 'omit' from source: magic vars 11571 1726776710.56298: starting attempt loop 11571 1726776710.56302: running the handler 11571 1726776710.56312: _low_level_execute_command(): starting 11571 1726776710.56319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11571 1726776710.58639: stdout chunk (state=2): >>>/root <<< 11571 
1726776710.58757: stderr chunk (state=3): >>><<< 11571 1726776710.58764: stdout chunk (state=3): >>><<< 11571 1726776710.58786: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11571 1726776710.58799: _low_level_execute_command(): starting 11571 1726776710.58805: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671 `" && echo ansible-tmp-1726776710.5879476-11571-22826531079671="` echo /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671 `" ) && sleep 0' 11571 1726776710.61348: stdout chunk (state=2): >>>ansible-tmp-1726776710.5879476-11571-22826531079671=/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671 <<< 11571 1726776710.61478: stderr chunk (state=3): >>><<< 11571 1726776710.61484: stdout chunk (state=3): >>><<< 11571 1726776710.61499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776710.5879476-11571-22826531079671=/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671 , stderr= 11571 1726776710.61566: variable 'ansible_module_compression' from source: unknown 11571 1726776710.61609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11571 1726776710.61642: variable 'ansible_facts' from source: unknown 11571 1726776710.61708: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_stat.py 11571 1726776710.61791: Sending initial data 11571 1726776710.61799: Sent initial data (151 bytes) 11571 1726776710.64312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmplw01_5rd /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_stat.py <<< 11571 1726776710.65356: stderr chunk (state=3): >>><<< 11571 1726776710.65363: stdout chunk (state=3): >>><<< 11571 
1726776710.65382: done transferring module to remote 11571 1726776710.65392: _low_level_execute_command(): starting 11571 1726776710.65396: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/ /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_stat.py && sleep 0' 11571 1726776710.67760: stderr chunk (state=2): >>><<< 11571 1726776710.67767: stdout chunk (state=2): >>><<< 11571 1726776710.67779: _low_level_execute_command() done: rc=0, stdout=, stderr= 11571 1726776710.67783: _low_level_execute_command(): starting 11571 1726776710.67788: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_stat.py && sleep 0' 11571 1726776710.84100: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776705.5206063, "mtime": 1726776707.295613, "ctime": 1726776707.295613, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11571 1726776710.85298: stderr chunk (state=3): >>>Shared connection to 
10.31.12.75 closed. <<< 11571 1726776710.85346: stderr chunk (state=3): >>><<< 11571 1726776710.85353: stdout chunk (state=3): >>><<< 11571 1726776710.85371: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726776705.5206063, "mtime": 1726776707.295613, "ctime": 1726776707.295613, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1809538096", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.12.75 closed. 
11571 1726776710.85413: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11571 1726776710.85500: Sending initial data 11571 1726776710.85507: Sent initial data (140 bytes) 11571 1726776710.88071: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpfc07irjz /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source <<< 11571 1726776710.88409: stderr chunk (state=3): >>><<< 11571 1726776710.88415: stdout chunk (state=3): >>><<< 11571 1726776710.88437: _low_level_execute_command(): starting 11571 1726776710.88443: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/ /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source && sleep 0' 11571 1726776710.90775: stderr chunk (state=2): >>><<< 11571 1726776710.90781: stdout chunk (state=2): >>><<< 11571 1726776710.90794: _low_level_execute_command() done: rc=0, stdout=, stderr= 11571 1726776710.90813: variable 'ansible_module_compression' from source: unknown 11571 1726776710.90849: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11571 1726776710.90871: variable 'ansible_facts' from source: 
unknown 11571 1726776710.90932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_copy.py 11571 1726776710.91013: Sending initial data 11571 1726776710.91020: Sent initial data (151 bytes) 11571 1726776710.93482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmp7l5ye5b4 /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_copy.py <<< 11571 1726776710.94562: stderr chunk (state=3): >>><<< 11571 1726776710.94568: stdout chunk (state=3): >>><<< 11571 1726776710.94586: done transferring module to remote 11571 1726776710.94595: _low_level_execute_command(): starting 11571 1726776710.94600: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/ /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_copy.py && sleep 0' 11571 1726776710.96923: stderr chunk (state=2): >>><<< 11571 1726776710.96932: stdout chunk (state=2): >>><<< 11571 1726776710.96945: _low_level_execute_command() done: rc=0, stdout=, stderr= 11571 1726776710.96949: _low_level_execute_command(): starting 11571 1726776710.96954: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/AnsiballZ_copy.py && sleep 0' 11571 1726776711.13492: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": 
"/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source", "_original_basename": "tmpfc07irjz", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11571 1726776711.14671: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11571 1726776711.14726: stderr chunk (state=3): >>><<< 11571 1726776711.14736: stdout chunk (state=3): >>><<< 11571 1726776711.14754: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source", "_original_basename": "tmpfc07irjz", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.12.75 closed. 
11571 1726776711.14784: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source', '_original_basename': 'tmpfc07irjz', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11571 1726776711.14796: _low_level_execute_command(): starting 11571 1726776711.14803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/ > /dev/null 2>&1 && sleep 0' 11571 1726776711.17267: stderr chunk (state=2): >>><<< 11571 1726776711.17276: stdout chunk (state=2): >>><<< 11571 1726776711.17290: _low_level_execute_command() done: rc=0, stdout=, stderr= 11571 1726776711.17299: handler run complete 11571 1726776711.17318: attempt loop complete, returning result 11571 1726776711.17322: _execute() done 11571 1726776711.17325: dumping result to json 11571 1726776711.17332: done dumping result, returning 11571 1726776711.17339: done running TaskExecutor() for managed_node2/TASK: Set profile_mode to auto [120fa90a-8a95-cec2-986e-000000000cb2] 11571 1726776711.17346: sending task result for task 120fa90a-8a95-cec2-986e-000000000cb2 11571 1726776711.17380: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cb2 11571 1726776711.17385: WORKER PROCESS EXITING 
changed: [managed_node2] => { "changed": true, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726776710.5879476-11571-22826531079671/source", "state": "file", "uid": 0 } 8218 1726776711.17522: no more pending results, returning what we have 8218 1726776711.17525: results queue empty 8218 1726776711.17526: checking for any_errors_fatal 8218 1726776711.17537: done checking for any_errors_fatal 8218 1726776711.17538: checking for max_fail_percentage 8218 1726776711.17539: done checking for max_fail_percentage 8218 1726776711.17540: checking to see if all hosts have failed and the running result is not ok 8218 1726776711.17540: done checking to see if all hosts have failed 8218 1726776711.17541: getting the remaining hosts for this loop 8218 1726776711.17542: done getting the remaining hosts for this loop 8218 1726776711.17545: getting the next task for host managed_node2 8218 1726776711.17551: done getting next task for host managed_node2 8218 1726776711.17553: ^ task is: TASK: Restart tuned 8218 1726776711.17555: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8218 1726776711.17559: getting variables 8218 1726776711.17560: in VariableManager get_vars() 8218 1726776711.17596: Calling all_inventory to load vars for managed_node2 8218 1726776711.17599: Calling groups_inventory to load vars for managed_node2 8218 1726776711.17601: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776711.17609: Calling all_plugins_play to load vars for managed_node2 8218 1726776711.17612: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776711.17614: Calling groups_plugins_play to load vars for managed_node2 8218 1726776711.17778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776711.17891: done with get_vars() 8218 1726776711.17900: done getting variables 8218 1726776711.17946: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restart tuned] *********************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 16:11:51 -0400 (0:00:00.631) 0:01:37.010 **** 8218 1726776711.17970: entering _queue_task() for managed_node2/service 8218 1726776711.18136: worker is 1 (out of 1 available) 8218 1726776711.18152: exiting _queue_task() for managed_node2/service 8218 1726776711.18163: done queuing things up, now waiting for results queue to drain 8218 1726776711.18165: waiting for pending results... 
11589 1726776711.18296: running TaskExecutor() for managed_node2/TASK: Restart tuned 11589 1726776711.18396: in run() - task 120fa90a-8a95-cec2-986e-000000000cb3 11589 1726776711.18413: variable 'ansible_search_path' from source: unknown 11589 1726776711.18417: variable 'ansible_search_path' from source: unknown 11589 1726776711.18452: variable '__kernel_settings_services' from source: include_vars 11589 1726776711.18692: variable '__kernel_settings_services' from source: include_vars 11589 1726776711.18752: variable 'omit' from source: magic vars 11589 1726776711.18847: variable 'ansible_host' from source: host vars for 'managed_node2' 11589 1726776711.18857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11589 1726776711.18866: variable 'omit' from source: magic vars 11589 1726776711.18919: variable 'omit' from source: magic vars 11589 1726776711.18951: variable 'omit' from source: magic vars 11589 1726776711.18980: variable 'item' from source: unknown 11589 1726776711.19036: variable 'item' from source: unknown 11589 1726776711.19057: variable 'omit' from source: magic vars 11589 1726776711.19088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11589 1726776711.19113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11589 1726776711.19135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11589 1726776711.19150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11589 1726776711.19161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11589 1726776711.19184: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11589 1726776711.19190: variable 'ansible_host' from source: host vars for 'managed_node2' 
11589 1726776711.19194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11589 1726776711.19262: Set connection var ansible_connection to ssh 11589 1726776711.19270: Set connection var ansible_pipelining to False 11589 1726776711.19278: Set connection var ansible_timeout to 10 11589 1726776711.19285: Set connection var ansible_module_compression to ZIP_DEFLATED 11589 1726776711.19290: Set connection var ansible_shell_type to sh 11589 1726776711.19295: Set connection var ansible_shell_executable to /bin/sh 11589 1726776711.19310: variable 'ansible_shell_executable' from source: unknown 11589 1726776711.19313: variable 'ansible_connection' from source: unknown 11589 1726776711.19317: variable 'ansible_module_compression' from source: unknown 11589 1726776711.19320: variable 'ansible_shell_type' from source: unknown 11589 1726776711.19323: variable 'ansible_shell_executable' from source: unknown 11589 1726776711.19327: variable 'ansible_host' from source: host vars for 'managed_node2' 11589 1726776711.19333: variable 'ansible_pipelining' from source: unknown 11589 1726776711.19336: variable 'ansible_timeout' from source: unknown 11589 1726776711.19340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11589 1726776711.19431: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11589 1726776711.19442: variable 'omit' from source: magic vars 11589 1726776711.19448: starting attempt loop 11589 1726776711.19452: running the handler 11589 1726776711.19516: variable 'ansible_facts' from source: unknown 11589 1726776711.19606: _low_level_execute_command(): starting 11589 1726776711.19615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11589 
1726776711.21916: stdout chunk (state=2): >>>/root <<< 11589 1726776711.22045: stderr chunk (state=3): >>><<< 11589 1726776711.22051: stdout chunk (state=3): >>><<< 11589 1726776711.22069: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11589 1726776711.22083: _low_level_execute_command(): starting 11589 1726776711.22089: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845 `" && echo ansible-tmp-1726776711.220764-11589-154265567313845="` echo /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845 `" ) && sleep 0' 11589 1726776711.24617: stdout chunk (state=2): >>>ansible-tmp-1726776711.220764-11589-154265567313845=/root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845 <<< 11589 1726776711.24741: stderr chunk (state=3): >>><<< 11589 1726776711.24747: stdout chunk (state=3): >>><<< 11589 1726776711.24762: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776711.220764-11589-154265567313845=/root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845 , stderr= 11589 1726776711.24784: variable 'ansible_module_compression' from source: unknown 11589 1726776711.24823: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-82185rtlnsy0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11589 1726776711.24872: variable 'ansible_facts' from source: unknown 11589 1726776711.25024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/AnsiballZ_systemd.py 11589 1726776711.25122: Sending initial data 11589 1726776711.25132: Sent initial data (154 bytes) 11589 1726776711.27575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-82185rtlnsy0/tmpry15v5lz /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/AnsiballZ_systemd.py <<< 11589 1726776711.29476: stderr chunk 
(state=3): >>><<< 11589 1726776711.29485: stdout chunk (state=3): >>><<< 11589 1726776711.29508: done transferring module to remote 11589 1726776711.29519: _low_level_execute_command(): starting 11589 1726776711.29524: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/ /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/AnsiballZ_systemd.py && sleep 0' 11589 1726776711.31870: stderr chunk (state=2): >>><<< 11589 1726776711.31878: stdout chunk (state=2): >>><<< 11589 1726776711.31892: _low_level_execute_command() done: rc=0, stdout=, stderr= 11589 1726776711.31896: _low_level_execute_command(): starting 11589 1726776711.31901: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/AnsiballZ_systemd.py && sleep 0' 11589 1726776711.59495: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 
2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "23007232", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11589 1726776711.59532: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": 
"0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChange<<< 11589 1726776711.59543: stdout chunk (state=3): >>>TimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11589 1726776711.61052: stderr chunk (state=3): >>>Shared connection to 10.31.12.75 closed. <<< 11589 1726776711.61099: stderr chunk (state=3): >>><<< 11589 1726776711.61105: stdout chunk (state=3): >>><<< 11589 1726776711.61124: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "659", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 
16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "23007232", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": 
"0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ConditionTimestampMonotonic": "7087950", "AssertTimestamp": "Thu 
2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.12.75 closed. 11589 1726776711.61263: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11589 1726776711.61285: _low_level_execute_command(): starting 11589 1726776711.61292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776711.220764-11589-154265567313845/ > /dev/null 2>&1 && sleep 0' 11589 1726776711.63636: stderr chunk (state=2): >>><<< 11589 1726776711.63643: stdout chunk (state=2): >>><<< 11589 1726776711.63656: _low_level_execute_command() done: rc=0, stdout=, stderr= 11589 1726776711.63663: handler run complete 11589 1726776711.63697: attempt loop complete, returning result 11589 1726776711.63714: variable 'item' from source: unknown 11589 1726776711.63776: variable 
'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ActiveEnterTimestampMonotonic": "7869987", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.socket basic.target polkit.service system.slice dbus.service network.target systemd-sysctl.service systemd-journald.socket sysinit.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:57 EDT", "AssertTimestampMonotonic": "7087951", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:57 EDT", 
"ConditionTimestampMonotonic": "7087950", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service auto-cpufreq.service tlp.service power-profiles-daemon.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "659", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:57 EDT", "ExecMainStartTimestampMonotonic": "7089596", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:57 EDT] ; stop_time=[n/a] ; pid=659 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:57 EDT", "InactiveExitTimestampMonotonic": "7089644", "InvocationID": "be99157d46d44d9c81f1119a7f4d9aa7", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "659", "MemoryAccounting": "yes", "MemoryCurrent": "23007232", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "sysinit.target dbus.socket system.slice dbus.service", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": 
"no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:57 EDT", "StateChangeTimestampMonotonic": "7869987", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:57 EDT", "WatchdogTimestampMonotonic": "7869983", "WatchdogUSec": "0" } } 11589 1726776711.63873: dumping result to json 11589 1726776711.63891: done dumping result, returning 11589 1726776711.63899: done running TaskExecutor() for managed_node2/TASK: Restart tuned [120fa90a-8a95-cec2-986e-000000000cb3] 11589 1726776711.63905: sending task result for task 120fa90a-8a95-cec2-986e-000000000cb3 11589 1726776711.64010: done sending task result for task 120fa90a-8a95-cec2-986e-000000000cb3 11589 1726776711.64015: WORKER PROCESS EXITING 8218 
1726776711.64341: no more pending results, returning what we have 8218 1726776711.64344: results queue empty 8218 1726776711.64344: checking for any_errors_fatal 8218 1726776711.64348: done checking for any_errors_fatal 8218 1726776711.64349: checking for max_fail_percentage 8218 1726776711.64350: done checking for max_fail_percentage 8218 1726776711.64350: checking to see if all hosts have failed and the running result is not ok 8218 1726776711.64351: done checking to see if all hosts have failed 8218 1726776711.64351: getting the remaining hosts for this loop 8218 1726776711.64352: done getting the remaining hosts for this loop 8218 1726776711.64354: getting the next task for host managed_node2 8218 1726776711.64359: done getting next task for host managed_node2 8218 1726776711.64360: ^ task is: TASK: meta (flush_handlers) 8218 1726776711.64362: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776711.64365: getting variables 8218 1726776711.64366: in VariableManager get_vars() 8218 1726776711.64388: Calling all_inventory to load vars for managed_node2 8218 1726776711.64390: Calling groups_inventory to load vars for managed_node2 8218 1726776711.64391: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776711.64398: Calling all_plugins_play to load vars for managed_node2 8218 1726776711.64399: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776711.64401: Calling groups_plugins_play to load vars for managed_node2 8218 1726776711.64505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776711.64613: done with get_vars() 8218 1726776711.64621: done getting variables 8218 1726776711.64672: in VariableManager get_vars() 8218 1726776711.64681: Calling all_inventory to load vars for managed_node2 8218 1726776711.64683: Calling groups_inventory to load vars for managed_node2 8218 1726776711.64684: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776711.64686: Calling all_plugins_play to load vars for managed_node2 8218 1726776711.64688: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776711.64689: Calling groups_plugins_play to load vars for managed_node2 8218 1726776711.64972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776711.65072: done with get_vars() 8218 1726776711.65081: done queuing things up, now waiting for results queue to drain 8218 1726776711.65082: results queue empty 8218 1726776711.65082: checking for any_errors_fatal 8218 1726776711.65086: done checking for any_errors_fatal 8218 1726776711.65087: checking for max_fail_percentage 8218 1726776711.65087: done checking for max_fail_percentage 8218 1726776711.65088: checking to see if all hosts have failed and the running result is not ok 8218 1726776711.65088: 
done checking to see if all hosts have failed 8218 1726776711.65088: getting the remaining hosts for this loop 8218 1726776711.65089: done getting the remaining hosts for this loop 8218 1726776711.65090: getting the next task for host managed_node2 8218 1726776711.65093: done getting next task for host managed_node2 8218 1726776711.65094: ^ task is: TASK: meta (flush_handlers) 8218 1726776711.65095: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8218 1726776711.65098: getting variables 8218 1726776711.65098: in VariableManager get_vars() 8218 1726776711.65105: Calling all_inventory to load vars for managed_node2 8218 1726776711.65106: Calling groups_inventory to load vars for managed_node2 8218 1726776711.65107: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776711.65110: Calling all_plugins_play to load vars for managed_node2 8218 1726776711.65111: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776711.65113: Calling groups_plugins_play to load vars for managed_node2 8218 1726776711.65187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776711.65285: done with get_vars() 8218 1726776711.65291: done getting variables 8218 1726776711.65318: in VariableManager get_vars() 8218 1726776711.65325: Calling all_inventory to load vars for managed_node2 8218 1726776711.65326: Calling groups_inventory to load vars for managed_node2 8218 1726776711.65327: Calling all_plugins_inventory to load vars for managed_node2 8218 1726776711.65331: Calling all_plugins_play to load vars for managed_node2 8218 1726776711.65332: Calling groups_plugins_inventory to load vars for managed_node2 8218 1726776711.65334: Calling 
groups_plugins_play to load vars for managed_node2 8218 1726776711.65406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8218 1726776711.65502: done with get_vars() 8218 1726776711.65510: done queuing things up, now waiting for results queue to drain 8218 1726776711.65511: results queue empty 8218 1726776711.65511: checking for any_errors_fatal 8218 1726776711.65513: done checking for any_errors_fatal 8218 1726776711.65513: checking for max_fail_percentage 8218 1726776711.65513: done checking for max_fail_percentage 8218 1726776711.65514: checking to see if all hosts have failed and the running result is not ok 8218 1726776711.65514: done checking to see if all hosts have failed 8218 1726776711.65514: getting the remaining hosts for this loop 8218 1726776711.65515: done getting the remaining hosts for this loop 8218 1726776711.65516: getting the next task for host managed_node2 8218 1726776711.65518: done getting next task for host managed_node2 8218 1726776711.65518: ^ task is: None 8218 1726776711.65519: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8218 1726776711.65520: done queuing things up, now waiting for results queue to drain 8218 1726776711.65520: results queue empty 8218 1726776711.65520: checking for any_errors_fatal 8218 1726776711.65521: done checking for any_errors_fatal 8218 1726776711.65521: checking for max_fail_percentage 8218 1726776711.65522: done checking for max_fail_percentage 8218 1726776711.65522: checking to see if all hosts have failed and the running result is not ok 8218 1726776711.65522: done checking to see if all hosts have failed 8218 1726776711.65524: getting the next task for host managed_node2 8218 1726776711.65525: done getting next task for host managed_node2 8218 1726776711.65526: ^ task is: None 8218 1726776711.65526: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node2 : ok=135 changed=19 unreachable=0 failed=0 skipped=58 rescued=0 ignored=0 Thursday 19 September 2024 16:11:51 -0400 (0:00:00.476) 0:01:37.486 **** =============================================================================== Reboot the machine - see if settings persist after reboot -------------- 27.17s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:95 Ensure required packages are installed ---------------------------------- 5.39s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:22 fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 5.19s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.89s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.82s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.82s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.81s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Gathering Facts --------------------------------------------------------- 1.79s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:2 fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 
1.51s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.51s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.50s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.49s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Ensure required services are enabled and started ------------------------ 1.08s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:51 fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 0.88s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Generate a configuration for kernel settings ---------------------------- 0.78s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:45 fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.75s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.73s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.71s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.71s 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.70s /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 8218 1726776711.65617: RUNNING CLEANUP